2014
DOI: 10.1134/s0005117914050105

Forecasting nonstationary time series based on Hilbert-Huang transform and machine learning

Cited by 20 publications (16 citation statements)
References 19 publications
“…Using a seasonal neural network approach for 72-h-ahead forecasting, [22] presented a model with an overall RMSE of 1.46 for the period, which is more than four times that of the simple linear model, although the forecasting period is also four times shorter than that of the simple linear model. The artificial neural network approach provides a one-hour-ahead forecast with an MAE of 1.44 and an RMSE of 1.92 [17]. The simple linear model provides an in-sample MAE of −0.12 and an out-of-sample MAE of −4.04.…”
Section: Discussion
confidence: 99%
“…The rapid growth in power production from renewable energy sources has increasingly urged producers to sell their electricity generation through the market, due to renewable intermittency. [The] neural network performs best with a MAPE of 3.12%, a mean absolute error (MAE) of 1.44, and a root mean squared error (RMSE) of 1.92 [17]. This is generally 0.5-1% better than the support vector machine approach [20,21].…”
Section: Introduction
confidence: 97%
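The two excerpts above compare forecasting models by MAE, RMSE, and MAPE. For reference, a minimal NumPy sketch of how these error metrics are commonly computed; the function name and the example values are illustrative and are not data from the cited studies.

import numpy as np

def forecast_errors(y_true, y_pred):
    # Return MAE, RMSE, and MAPE of a forecast against observed values
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                    # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))             # root mean squared error
    mape = 100.0 * np.mean(np.abs(err / y_true))  # mean absolute percentage error, %
    return mae, rmse, mape

# Synthetic example, not data from the cited papers
mae, rmse, mape = forecast_errors([10.0, 12.0, 11.5], [9.5, 12.4, 11.0])
print(f"MAE={mae:.2f}  RMSE={rmse:.2f}  MAPE={mape:.2f}%")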
“…ANNs process information like the neural network of the human brain, i.e., by connecting simple units/nodes, called artificial neurons, to form complex networks [19]. Each node includes an activation function to produce an output value based on one or multiple inputs.…”
Section: ANN
confidence: 99%
“…ANNs have risen in importance since the 1980s and have been a research hotspot in the field of artificial intelligence. They have widespread applications in data processing, classification, regression, function approximation, and numerical control. ANNs process information like the neural network of the human brain, i.e., by connecting simple units/nodes, called artificial neurons, to form complex networks [19]. Each node includes an activation function to produce an output value based on one or multiple inputs.…”
confidence: 99%
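The two excerpts above describe an ANN node as a unit that combines one or more inputs and passes the weighted sum through an activation function. A minimal sketch of a single node, assuming a sigmoid activation; the weights, bias, and input values are illustrative, not details from reference [19].

import numpy as np

def sigmoid(z):
    # Logistic activation: maps the weighted input sum into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron(inputs, weights, bias):
    # One artificial neuron: weighted sum of inputs plus bias, then activation
    return sigmoid(np.dot(weights, inputs) + bias)

# Example node with three inputs (arbitrary values)
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.1, -0.2])
print(neuron(x, w, bias=0.1))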
“…In this research, we introduced the method of EOFs to the MRD with the aid of the Hilbert transform. In the area of signal processing and machine learning, the Hilbert transform (with empirical mode decomposition) was used to capture the dynamic properties of time series and to generate feature variables for prediction models [16]. It is thus expected that the proposed portfolio construction will outperform conventional portfolio constructions by including the dynamic properties of the risks, which are derived from the Hilbert transform of the returns of the assets.…”
Section: Introduction
confidence: 99%
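The excerpt above refers to the Hilbert transform, combined with empirical mode decomposition in the Hilbert-Huang transform, as a way to derive dynamic features of a time series for prediction models. A minimal sketch of the Hilbert step alone, assuming SciPy is available; in the full Hilbert-Huang pipeline the same computation would be applied to each intrinsic mode function produced by EMD, which is omitted here, and the signal below is synthetic.

import numpy as np
from scipy.signal import hilbert

def hilbert_features(x, fs):
    # Instantaneous amplitude and frequency of a signal via the Hilbert transform
    analytic = hilbert(x)                          # analytic signal x + i*H[x]
    amplitude = np.abs(analytic)                   # instantaneous amplitude (envelope)
    phase = np.unwrap(np.angle(analytic))          # unwrapped instantaneous phase
    frequency = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency in Hz
    return amplitude, frequency

# Illustrative amplitude-modulated tone sampled at 100 Hz
fs = 100.0
t = np.arange(0, 2, 1 / fs)
x = (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * 5 * t)
amp, freq = hilbert_features(x, fs)
print(amp[:3], freq[:3])

The amplitude and frequency series can then serve as feature variables for a downstream prediction model, which is the use described in the excerpt.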