2020
DOI: 10.48550/arxiv.2008.08878
Preprint
Reinforcement Learning based dynamic weighing of Ensemble Models for Time Series Forecasting

Cited by 1 publication (3 citation statements)
References 0 publications
“…ERL can be divided into parallel ERL and sequential ERL according to the relationship between the base learners. Figure 6 and Figure 7 give schematic diagrams of these two frameworks. [Table: base learners combined in the surveyed ERL works]
- (unattributed): long short-term memory network, gated recurrent unit network
- Goyal et al [46]: convolutional neural network, gated recurrent unit
- Liu et al [54]: long short-term memory network, deep belief network, echo state network
- Perepu et al [55]: linear regression, long short-term memory, artificial neural network, random forest
- Liu et al [56]: graph convolutional network, long short-term memory network, gated recurrent unit
- Saadallah et al [50]: autoregressive integrated moving average, exponential smoothing, gradient boosting machines, Gaussian processes, support vector regression, random forest, projection pursuit regression, MARS, principal component regression, decision tree regression, partial least squares regression, multilayer perceptron, long short-term memory network (LSTM), bidirectional LSTM (Bi-LSTM), CNN-based LSTM, convolutional LSTM
- Daniel L. Elliott and Charles Anderson [57]: convolutional neural network, gated recurrent unit, artificial neural network
- Shang et al [30]: gated recurrent unit, graph convolutional network, graph attention network
- Tan et al [31]: graph attention network, long short-term memory network, temporal convolutional network
- Li et al [58]: gated recurrent unit, deep belief network, temporal convolutional network
- Zijie Cao and Hui Liu [59]: temporal convolutional network, bidirectional long short-term memory network, kernel extreme learning machine
- Birman et al [60]: machine learning models, artificial neural network
- Li et al [51]: naive Bayes, support vector machine with stochastic gradient descent, FastText, bidirectional long short-term memory
- Sharma et al [52]: support vector regressor (SVR), eXtreme gradient boosting (XGBoost), random forest (RF), artificial neural network (ANN), long short-term memory (LSTM), convolutional neural network (CNN), CNN-LSTM, CNN-XGB, CNN-SVR, CNN-RF
- Shi Yin and Hui Liu [61]: group method of data handling, echo state network, extreme learning machine
- Yu et al [29]: graph attention network, gated recurrent unit, temporal convolutional network

There are also works that try to construct the ERL method in the sequential framework [64,65].…”
Section: Combination Of Models
confidence: 99%
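The parallel/sequential distinction drawn above can be sketched in a few lines. This is a minimal illustration of the two arrangements only; the stand-in "learners" are simple numeric functions, not models from any of the cited works.

```python
# Minimal sketch of the two ERL arrangements: parallel (all base learners
# see the same input, outputs are combined) vs. sequential (each learner
# consumes the previous learner's output).

def parallel_ensemble(x, learners, combine):
    """Parallel ERL: run every base learner on the same input,
    then combine their outputs."""
    return combine([f(x) for f in learners])

def sequential_ensemble(x, learners):
    """Sequential ERL: chain the base learners, feeding each one
    the output of its predecessor."""
    out = x
    for f in learners:
        out = f(out)
    return out

# Stand-in "learners": simple numeric transforms (illustrative only).
learners = [lambda v: v + 1, lambda v: v * 2]
mean = lambda ys: sum(ys) / len(ys)

parallel_ensemble(3, learners, mean)    # (4 + 6) / 2 = 5.0
sequential_ensemble(3, learners)        # (3 + 1) * 2 = 8
```

The same set of base learners produces different results under the two arrangements, which is why the survey treats them as distinct ERL families.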
“…Weighted Aggregation: The prediction results obtained from the different models are summed according to their weights [55]. A higher weight is assigned to models with higher prediction accuracy.…”
Section: Decision Strategies
confidence: 99%
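The weighted-aggregation strategy quoted above can be sketched as a weighted sum over per-model forecasts. The predictions and weights below are illustrative assumptions, not values from the paper; the only property carried over from the text is that the more accurate model receives the larger weight.

```python
import numpy as np

def weighted_aggregate(predictions, weights):
    """Combine per-model forecasts by a weighted sum.

    predictions: array-like of shape (n_models, horizon)
    weights: array-like of shape (n_models,), assumed to sum to 1
    """
    predictions = np.asarray(predictions, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return weights @ predictions  # weighted sum over the model axis

# Illustrative example: three base models forecasting a 4-step horizon.
preds = [[1.0, 2.0, 3.0, 4.0],
         [1.5, 2.5, 3.5, 4.5],
         [0.5, 1.5, 2.5, 3.5]]
# Higher weight on the (assumed) historically more accurate model.
w = [0.5, 0.3, 0.2]
combined = weighted_aggregate(preds, w)  # array([1.05, 2.05, 3.05, 4.05])
```

In the paper's setting these weights are not fixed but updated dynamically by a reinforcement learning agent; the sketch only shows the aggregation step itself.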