2015
DOI: 10.1109/tla.2015.7273790

Are neural networks able to forecast nonlinear time series with moving average components?

Abstract: In nonlinear time series forecasting, neural networks are interpreted as nonlinear autoregressive models because they take the previous values of the time series as inputs. However, the use of neural networks to forecast nonlinear time series with moving average components is an issue usually omitted in the literature. In this article, we investigate the use of traditional neural networks for forecasting nonlinear time series with moving average components, and we demonstrate the necessity of formulating new neural n…


Cited by 9 publications (3 citation statements)
References 25 publications
“…end for
(20) Train the Rotation Forest in training set
(21) for training set do
(22)     y ← D^(k)(F^(k))
(23) end for
(24) Calculate y⟨k⟩
(25) for training and testing set do
(26)     y ← D^(k)(F^(k))
(27) end for
(28) Calculate changes
(29)     s ← MSE(y^(k), y^(k))
(30) until k > k_max or s < eps
(31) Construct F⟨k⟩
(32) for j from 1 to k do
(33)     Calculate e^(k), F^(k), y^(k)
(34) end for
(35) …”
Section: Discussion About MA Terms
confidence: 99%
“…The main obstacle of such a combination of machine learning algorithms and the ARMA or ARIMA model is that the MA parts are essentially the error terms between the real and predicted observations, so they refer to the model result, which is still unknown when solving the model. Thus, the family of ARMA models is hard to model with the ANN and other similar ML structures [32][33][34]. In the application of time series models to HI or RUL prediction, researchers mainly combine the ARMA family and ML methods as two independent parts, such as combining ARMA and SVM [35] and particle filter (PF) [23].…”
Section: Introduction
confidence: 99%
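The obstacle described above can be made concrete with a short sketch (an illustrative assumption, not code from any of the cited papers). An MA(1) process y_t = e_t + θ·e_{t-1} depends on latent shocks e_{t-1}; a network whose inputs are only lagged values of y never observes those shocks directly:

```python
import numpy as np

# Simulate an MA(1) process: y_t = e_t + theta * e_{t-1}.
# The moving-average term depends on the unobserved shocks e_{t-1};
# theta and the series length are illustrative choices.
rng = np.random.default_rng(0)
theta = 0.8
e = rng.standard_normal(500)          # latent shock sequence (never observed)
y = e[1:] + theta * e[:-1]            # observed MA(1) series, length 499

# A "nonlinear autoregressive" design matrix built only from lagged y,
# as a feed-forward network would receive it:
p = 3
X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
target = y[p:]
# X contains no column for e_{t-1}: the shocks would have to be inferred
# recursively from past residuals, which a plain feed-forward net cannot do.
```

This is why, as the statement notes, the MA part refers to a model result that is still unknown while the model is being fit.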
“…In order to adjust the weights and biases in the neurons, BPNN employs the error backpropagation operation. Benefiting from its gradient-descent nature, the algorithm has become an effective function approximation method [20, 21]. A standard BPNN consisting of m inputs and n outputs is shown in Figure 1.…”
Section: Algorithm Design
confidence: 99%
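A minimal sketch of the BPNN described above: m inputs, one hidden tanh layer, n linear outputs, trained by gradient descent with error backpropagation. The layer sizes, learning rate, and toy target here are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

# Minimal BPNN: m inputs, one hidden tanh layer, n linear outputs.
# Weights are adjusted by backpropagating the output error and taking
# gradient-descent steps on a mean-squared-error objective.
rng = np.random.default_rng(1)
m, h, n = 2, 8, 1                       # inputs, hidden units, outputs
W1 = rng.normal(0.0, 0.5, (m, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.5, (h, n)); b2 = np.zeros(n)

X = rng.uniform(-1.0, 1.0, (64, m))
Y = X[:, :1] * X[:, 1:2]                # toy target: product of the two inputs

lr = 0.3
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)            # forward pass, hidden layer
    out = H @ W2 + b2                   # forward pass, output layer
    err = out - Y                       # output error to backpropagate
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H**2)    # backprop through the tanh layer
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2      # gradient-descent weight updates
    W1 -= lr * gW1; b1 -= lr * gb1
```

The network inputs are only the current feature vector; as the earlier citation statement notes, such a feed-forward structure has no mechanism for tracking latent MA error terms across time steps.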