2014 IEEE Symposium on Computational Intelligence in Ensemble Learning (CIEL) 2014
DOI: 10.1109/ciel.2014.7015739
Ensemble deep learning for regression and time series forecasting

Cited by 270 publications (167 citation statements) · References 18 publications
“…Little work has been done in the field of time series regression using DNNs, and almost no work has been done in the field of energy forecasting with DNNs. One notable example of literature on these subjects is Qiu et al., who claim to be the first to use DNNs for regression and time series forecasting [13]. They show promising results on three electric load demand time series and several other time series using 20 DNNs ensembled with support vector regression.…”
Section: Prior Work
confidence: 99%
“…Deep-learning methods are machine learning algorithms [10] with multiple levels of representation; they are representation-learning methods obtained by composing simple modules, each of which transforms the representation at one level into a representation at a higher, slightly more abstract level [11]. Supervised deep-learning algorithms include recurrent networks [12], convolutional neural networks [13], and multilayer perceptrons [14].…”
Section: Related Work
confidence: 99%
“…In recent years, recurrent neural networks (RNN) have achieved remarkable success in a range of sequence modeling tasks (Lipton et al., 2015; Kuremoto et al., 2014; Qiu et al., 2014). Inspired by the success of recurrent neural networks with pre-trained word embeddings for text modeling, we use a stack of RNN layers for encoding the textual content of a post.…”
Section: Introduction
confidence: 99%
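The stacked-RNN encoder mentioned in that quote can be illustrated with a minimal NumPy sketch: two Elman-style recurrent layers run over a sequence of (here randomly generated, hypothetical) word embeddings, with the final hidden state serving as the post encoding. The sizes, initialization, and layer count are assumptions for illustration, not the cited model.

```python
import numpy as np

rng = np.random.default_rng(1)

def rnn_layer(inputs, W_x, W_h, b):
    """One Elman RNN layer: returns the hidden state at every time step."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(W_x @ x + W_h @ h + b)
        states.append(h)
    return np.stack(states)

# Hypothetical sizes: 8-dim word embeddings, two stacked 16-unit layers.
emb_dim, hidden = 8, 16
post = rng.standard_normal((12, emb_dim))  # 12 tokens' embeddings

# Build the stack: each layer consumes the previous layer's state sequence.
layers = []
in_dim = emb_dim
for _ in range(2):
    layers.append((rng.standard_normal((hidden, in_dim)) * 0.1,
                   rng.standard_normal((hidden, hidden)) * 0.1,
                   np.zeros(hidden)))
    in_dim = hidden

x = post
for W_x, W_h, b in layers:
    x = rnn_layer(x, W_x, W_h, b)

encoding = x[-1]       # final hidden state of the top layer encodes the post
print(encoding.shape)  # (16,)
```

In practice such an encoder would use trained weights and gated cells (LSTM/GRU), but the stacking pattern, feeding one layer's full state sequence into the next, is the same.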