2021
DOI: 10.1016/j.trc.2021.103357
The ensemble approach to forecasting: A review and synthesis

Cited by 43 publications (22 citation statements)
References 51 publications
“…Such an analysis will demonstrate if and to what extent performance varies not only across forecast dates, horizons, and locations, but also across seasons. The results of this analysis may help forecasters and hub coordinators understand sensitivity of contributions to seasonal dynamics, and to reinforce the utility of consortium ensemble approaches that have previously been shown to improve forecasting accuracy (Wang et al 2022; Wu and Levinson 2021).…”
Section: Introduction (mentioning)
confidence: 63%
“…In addition, the ensemble model, by a collective decision mechanism, focuses on synthesizing information from several sub-models with different structures and has been shown to reduce average error and combine the strengths of models in exploring diverse data patterns [52][53][54]. However, the addition of a poorly performing model will not reduce the overall classification skill, because the ensemble model has a net gain compared to the single models [55,56]. Given the above, the ensemble model can reduce the risk of relying on a single prediction distribution and extract richer semantic feature information than the single CNN models (for example, each sub-model assigns different pixel-level probabilities to boundary regions during training), which is beneficial in classification tasks for achieving better performance and improving classification accuracy [57][58][59][60][61][62].…”
Section: Discussion (mentioning)
confidence: 99%
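The statement above describes a heterogeneous ensemble that pools differently structured sub-models through a collective decision mechanism. As a minimal, hypothetical sketch of that idea (not the cited papers' CNN setup; the scikit-learn estimators and synthetic data are assumptions for illustration only), a soft-voting ensemble averages the predicted class probabilities of several base models so that no single prediction distribution dominates:

# Hedged sketch: heterogeneous soft-voting ensemble on synthetic data.
# The base models and data are placeholders, not the cited papers' models.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three structurally different base models; soft voting averages their
# predicted probabilities, reducing reliance on any single model.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svc", SVC(probability=True, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("held-out accuracy:", ensemble.score(X_test, y_test))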
“…Heterogeneous ensemble learning combines multiple different base models to obtain higher predictive performance than any of the constituent base models alone. One popular way to combine base models is a stacking method (or “super learner”) (Wu and Levinson, 2021; Zhang and Ma, 2012). The steps are as follows.…”
Section: Predictive Models (mentioning)
confidence: 99%
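The snippet cuts off before listing the stacking steps, but the usual recipe is: fit each base model, generate out-of-fold predictions for every training point, then train a meta-learner (the "super learner") on those predictions. A minimal sketch under assumed choices (scikit-learn's StackingRegressor, arbitrary base models, synthetic data; none of this is taken from the cited papers):

# Hedged sketch of stacking / super learning with assumed base models.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=400, n_features=10, noise=0.5, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("ridge", Ridge()),
        ("knn", KNeighborsRegressor(n_neighbors=5)),
        ("gbr", GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=Ridge(),  # meta-learner trained on base-model predictions
    cv=5,                     # out-of-fold predictions guard against leakage
)
stack.fit(X, y)
print("in-sample R^2 of the stacked model:", stack.score(X, y))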
“…While bagging trains base models independently from each other in parallel (Bühlmann and Yu, 2002), boosting does them one-by-one sequentially in an adaptive way, where successive models are trained to predict the remaining error from the previous model (Zhang and Ma, 2012). On the other hand, heterogeneous ensemble uses stacking method (also known as "super learner" method), where a meta-learner is trained on how to combine the output of the heterogeneous base models (Wu and Levinson, 2021). The stacking method differs from the weighted average methods, in that weights for different base models do not have to add up to one.…”
Section: Visuals and Speech (mentioning)
confidence: 99%
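To make the contrast concrete, the hypothetical sketch below (scikit-learn estimators and synthetic data are assumptions, not the cited setup) fits a bagging ensemble of independent trees, a boosting ensemble trained sequentially on residual errors, and a stacked model whose linear meta-learner combines them; unlike a weighted average, the learned meta-learner weights are unconstrained and need not sum to one:

# Hedged sketch contrasting bagging, boosting, and stacking (assumed models).
from sklearn.datasets import make_regression
from sklearn.ensemble import (BaggingRegressor, GradientBoostingRegressor,
                              StackingRegressor)
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=300, n_features=8, noise=1.0, random_state=1)

# Bagging: base trees trained independently (in parallel) on bootstrap samples.
bagging = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=1)

# Boosting: trees added one by one, each fit to the previous model's residuals.
boosting = GradientBoostingRegressor(random_state=1)

# Stacking: a linear meta-learner combines the heterogeneous base models' outputs.
stack = StackingRegressor(
    estimators=[("bag", bagging), ("boost", boosting), ("ridge", Ridge())],
    final_estimator=LinearRegression(),
)
stack.fit(X, y)

weights = stack.final_estimator_.coef_
print("meta-learner weights:", weights, "sum:", round(float(weights.sum()), 3))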