2017 IEEE International Conference on Data Science and Advanced Analytics (DSAA)
DOI: 10.1109/dsaa.2017.26

Dynamic and Heterogeneous Ensembles for Time Series Forecasting

Abstract: This paper addresses the issue of learning time series forecasting models in changing environments by leveraging the predictive power of ensemble methods. Concept drift adaptation is performed in an active manner, by dynamically combining base learners according to their recent performance using a non-linear function. Diversity in the ensembles is encouraged with several strategies that include heterogeneity among learners, sampling techniques and computation of summary statistics as extra predictors. Heteroge…
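
As a rough illustration of the combination rule the abstract describes, the sketch below weights base learners by a non-linear (softmax-style) transform of their recent errors. The window length, the squared-error loss, and the softmax form are assumptions, not the paper's exact choices, and the names recent_weights and combine are illustrative.

# Minimal sketch: dynamically combine base learners by recent performance.
# Assumed (not from the paper): losses over a sliding window, softmax of
# negative mean recent loss as the non-linear weighting function.
import numpy as np

def recent_weights(errors, window=50, temperature=1.0):
    # errors: (n_models, n_past_points) array of past losses per model
    recent = errors[:, -window:].mean(axis=1)   # mean recent loss per model
    scores = -recent / temperature              # lower loss -> higher score
    w = np.exp(scores - scores.max())           # numerically stable softmax
    return w / w.sum()

def combine(preds, weights):
    # preds: (n_models,) one-step-ahead forecasts from the base learners
    return float(np.dot(weights, preds))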

Cited by 22 publications (17 citation statements), published between 2018 and 2023. References 22 publications (27 reference statements).

Citation statements, ordered by relevance:
“…12,13 Despite the popular practice of using identical type of models in an "ensemble" to handle chemical engineering problems, [14][15][16] the combination of models with essentially distinct structures (defined as "heterogeneous" here), seems to be more attractive. 17,18 As pointed out by Vinay et al, 19 the heterogeneous ensemble significantly improves model performance with regard to generalization ability. Therefore, in this investigation, nine base models mentioned in Ge et al 10 form a "pool" to be considered for introduction to the ensemble.…”
Section: Introduction
confidence: 87%
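
For context, a heterogeneous pool in the sense of this citation simply mixes structurally different regressors. The sketch below uses scikit-learn models as illustrative stand-ins; they are not the nine base models of Ge et al referenced above.

# Illustrative heterogeneous pool of base models (stand-ins, not the
# nine models cited above).
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

pool = {
    "ridge": Ridge(alpha=1.0),
    "svr": SVR(C=1.0),
    "knn": KNeighborsRegressor(n_neighbors=5),
    "rf": RandomForestRegressor(n_estimators=200, random_state=0),
    "gbr": GradientBoostingRegressor(random_state=0),
}
# Structural diversity among the members, rather than resampling alone,
# is what makes such an ensemble "heterogeneous".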
“…Recent literature reports from different fields confirm that multi‐model ensemble could combine the advantages of base models and lead to better predictive performance. Despite the popular practice of using identical type of models in an “ensemble” to handle chemical engineering problems, the combination of models with essentially distinct structures (defined as “heterogeneous” here), seems to be more attractive. As pointed out by Vinay et al, the heterogeneous ensemble significantly improves model performance with regard to generalization ability.…”
Section: Introduction
confidence: 99%
“…7) Ensemble based on performance of most recent data points (recent ensemble): This technique, adapted from [24], selects and combines base models based on the performance on the most recent data points from the training dataset. The technique selects the best-performing λ% models and computes their weights by taking into account their performance on the P most recent points of the train dataset.…”
Section: Backward
confidence: 99%
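
A minimal sketch of that selection-and-weighting step follows, assuming absolute error as the performance score and inverse-error weights; the cited work's exact metric and normalization may differ.

# Sketch of the "recent ensemble": keep the best lam fraction of models,
# scored on the P most recent training points, and weight them by score.
import numpy as np

def recent_ensemble_forecast(preds_hist, y_hist, preds_next, lam=0.5, P=20):
    # preds_hist: (n_models, n_points) past predictions on the training data
    # y_hist: (n_points,) observed targets; preds_next: (n_models,) forecasts
    err = np.abs(preds_hist[:, -P:] - y_hist[-P:]).mean(axis=1)
    k = max(1, int(np.ceil(lam * len(err))))    # keep the best lam fraction
    best = np.argsort(err)[:k]
    w = 1.0 / (err[best] + 1e-12)               # inverse-error weights (assumed)
    w /= w.sum()
    return float(np.dot(w, preds_next[best]))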
“…The parameters λ and P are therefore hyperparameters. Contrary to [24], we use the sAPE (sMAPE without mean) instead of the Squared Error to compute the performance score for scaling reasons, and do not update our train set over time.…”
Section: Backward
confidence: 99%
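
For reference, the per-point sAPE referenced above (sMAPE is its mean over all points) looks like this under one common convention; the scaling factor of 200 varies between authors.

# Symmetric absolute percentage error for a single point.
def sape(y, yhat):
    # undefined when y == yhat == 0; callers should guard that case
    return 200.0 * abs(y - yhat) / (abs(y) + abs(yhat))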
“…The idea behind this approach is that the short-term future will be similar to recent past and earlier observations are not as relevant. Windowing approaches have been used to weight and combine the available models (Newbold & Granger, 1974;Cerqueira et al, 2017) (WL), or to select the recent best performing one (van Rijn et al, 2015) (BLAST).…”
Section: Windowing Approaches
confidence: 99%
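
A BLAST-style selection step, as described in this citation, might look like the sketch below; the window size and the squared-error metric are assumptions.

# Windowing-based selection: pick the single model with the lowest error
# on the last w observations (the "recent best").
import numpy as np

def select_recent_best(preds_hist, y_hist, w=30):
    # preds_hist: (n_models, n_points) past predictions; y_hist: targets
    err = np.mean((preds_hist[:, -w:] - y_hist[-w:]) ** 2, axis=1)
    return int(np.argmin(err))   # index of the best model on the window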