Original link: tecdat.cn/?p=16392

Original source: Tuo End Data Tribe WeChat public account


 

For this example, I will model the time series in R. I keep the last 24 observations as a test set and use the remaining observations to fit the neural networks. Two types of neural networks are currently available: multilayer perceptrons (MLP) and extreme learning machines (ELM).
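The split itself is not shown in the post; here is a minimal base-R sketch, assuming the monthly AirPassengers series as a stand-in for the (unnamed) data, and using y.in and tst.n as the names that appear in the later commands:

```r
# Hold out the last 24 observations as a test set.
# AirPassengers is only a stand-in; the post does not name its series.
y     <- AirPassengers
tst.n <- 24
tt    <- time(y)
y.in  <- window(y, end = tt[length(y) - tst.n])        # training window
y.out <- window(y, start = tt[length(y) - tst.n + 1])  # held-out test set
```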

 mlp.fit <- mlp(y.in)
 print(mlp.fit)
 plot(mlp.fit)

This is the basic command to fit an MLP network to a time series. It attempts to automatically determine the necessary preprocessing and the autoregressive inputs for the series. With the pre-specified parameters, it trains 20 networks, each with a single hidden layer of five nodes, whose forecasts are combined into an ensemble. print gives a summary of the fitted networks:

 MLP fit with 5 hidden nodes and 20 repetitions.
 Series modelled in differences: D1.
 Univariate lags: (1,3,4,6,7,8,9,10)
 Deterministic seasonal dummies included.
 Forecast combined using the median operator.
 MSE: 6.2011.

The function selects the autoregressive lags and applies dummy variables to model seasonality. Use plot to display the architecture of the network (Figure 1).

 

Figure 1. The output of plot(mlp.fit).

Light red inputs represent the binary dummy variables used to encode seasonality, while grey inputs are the autoregressive lags. To generate a forecast, you can type:

 forecast(mlp.fit, h=tst.n)

Figure 2 shows the overall prediction, as well as the predictions of the individual neural networks.

Figure 2. The plot of the MLP forecasts.

You can also let the function select the number of hidden nodes automatically.

 mlp.fit <- mlp(y.in, hd.auto.type="valid")

This evaluates 1 to 10 hidden nodes and selects the number that minimises the MSE on a validation set. Cross-validation can be used instead (hd.auto.type="cv"). The errors are reported as:

         MSE
 H.1  0.0083
 H.2  0.0066
 H.3  0.0065
 H.5  0.0071
 H.6  0.0074
 H.7  0.0076
 H.9  0.0083
 H.10 0.0076
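The selection rule is simply the argmin of these validation errors. Using only the values printed above (the rows for H.4 and H.8 did not survive extraction), the choice can be reproduced in base R:

```r
# Validation MSE per number of hidden nodes, as printed above
mse <- c(H.1 = 0.0083, H.2 = 0.0066, H.3 = 0.0065, H.5 = 0.0071,
         H.6 = 0.0074, H.7 = 0.0076, H.9 = 0.0083, H.10 = 0.0076)
best <- names(which.min(mse))  # architecture with the lowest validation MSE
best                           # "H.3": three hidden nodes
```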

ELM works in almost the same way.

 # Fit ELM
 elm.fit <- elm(y.in)
 print(elm.fit)
 plot(elm.fit)

The following is the model summary:

 ELM fit with 100 hidden nodes and 20 repetitions.
 Series modelled in differences: D1.
 Univariate lags: (4,6,7,8,9,10,12)
 Deterministic seasonal dummies included.
 Forecast combined using the median operator.
 Output weight estimation using: lasso.
 MSE: 83.0044.

In the network architecture in Figure 3, only the nodes connected to the output layer with black lines contribute to the forecast; the remaining connection weights have been shrunk to zero.

Figure 3. ELM network architecture.

Neural networks can also be combined with temporal hierarchy forecasting (THieF), as implemented in R by the thief package. You can do the following:


 thief(y.in, h=tst.n, forecastfunction=mlp.thief)

Because I held back a test set for this simple example, I can compare these forecasts against exponential smoothing (ETS):

 

METHOD MAE
MLP (5 nodes) 62.471
MLP (auto) 48.234
ELM 48.253
THieF-MLP 45.906
ETS 64.528

 

Temporal hierarchies, like MAPA, make forecasts more robust and accurate. However, using neural networks substantially increases the computational cost!
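For reference, the MAE reported in the table is the mean absolute error of the point forecasts over the 24 held-out observations. A minimal helper (the vectors below are toy values for illustration, not the article's actual forecasts):

```r
# Mean absolute error of point forecasts against the held-out actuals
mae <- function(actual, forecast) mean(abs(actual - forecast))

actual   <- c(100, 110, 120)   # toy held-out values
forecast <- c( 98, 115, 118)   # toy point forecasts
mae(actual, forecast)          # (2 + 5 + 2) / 3 = 3
```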

