2002
DOI: 10.1016/s0167-9473(01)00068-8
Improving nonparametric regression methods by bagging and boosting

Cited by 54 publications
(27 citation statements)
References 10 publications
“…The fundamental reason for the effectiveness of bagging unstable models is that the predictive capability of the regression models is based on the bias/variance trade-off, and the variance of the aggregated predictor is reduced and close-to-constant bias is maintained [52].…”
Section: Bagging and Subagging for Non-linear Multivariate Calibration
confidence: 99%
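The variance-reduction mechanism this excerpt describes can be sketched with a small bootstrap-aggregation loop. This is a toy illustration only, not an experiment from the paper: the noisy-sine data and the 1-nearest-neighbour base learner (chosen because it is a deliberately unstable predictor of the kind bagging helps most) are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy sine curve (illustrative, not from the paper).
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 80))
y = np.sin(x) + rng.normal(0.0, 0.3, x.size)

def fit_1nn(xs, ys):
    """Return a 1-nearest-neighbour regressor: an intentionally
    unstable base learner, so bagging has variance to remove."""
    def predict(x0):
        x0 = np.asarray(x0)
        # Distance of every query point to every training point.
        idx = np.argmin(np.abs(xs - x0[..., None]), axis=-1)
        return ys[idx]
    return predict

# Bagging: refit the base learner on B bootstrap resamples of the
# training set and average the B predictions pointwise.
B = 50
grid = np.linspace(0.0, 2.0 * np.pi, 200)
boot_preds = []
for _ in range(B):
    idx = rng.integers(0, x.size, x.size)      # sample with replacement
    boot_preds.append(fit_1nn(x[idx], y[idx])(grid))
bagged = np.mean(boot_preds, axis=0)           # aggregated predictor

single = fit_1nn(x, y)(grid)
truth = np.sin(grid)
print("MSE, single 1-NN:", np.mean((single - truth) ** 2))
print("MSE, bagged 1-NN:", np.mean((bagged - truth) ** 2))
```

Averaging leaves the (roughly common) bias of the bootstrap fits in place while damping their sample-to-sample fluctuations, which is exactly the "reduced variance, close-to-constant bias" behaviour the quoted statement attributes to bagging.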
“…It has been observed that this simple averaging rule can significantly improve the model accuracy and robustness in various applications [4,6,22,23].…”
Section: Combination Using Simple Averaging Rule
confidence: 99%
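The simple averaging rule mentioned here is just a pointwise mean of the component models' predictions. A minimal numeric sketch (the three model outputs and the target values are made-up illustrative numbers, not data from any cited paper) shows how averaging can beat every individual model when their errors partly cancel:

```python
import numpy as np

# Hypothetical predictions from three regression models on the
# same five test points (illustrative numbers only).
preds = np.array([
    [1.0, 2.1, 3.0, 3.9, 5.2],   # model A
    [1.2, 1.9, 3.1, 4.1, 4.8],   # model B
    [0.9, 2.0, 2.9, 4.0, 5.0],   # model C
])
truth = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

combined = preds.mean(axis=0)                 # simple averaging rule
mse_each = ((preds - truth) ** 2).mean(axis=1)
mse_comb = ((combined - truth) ** 2).mean()
print("individual MSEs:", mse_each)
print("combined MSE:  ", mse_comb)
```

In this example the individual errors point in different directions at each test point, so the averaged prediction lands closer to the truth than any single model; when the models' errors are strongly correlated, the gain shrinks accordingly.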
“…[3,5,17,26]. It has, however, also been successfully applied to regressions such as projection pursuit regression (PPR), multivariate adaptive regression splines (MARS), local learning based on recursive covering (DART), support vector machines (SVM), regression trees and linear regression with forward variable selection [2,4,5,24].…”
Section: Bagging
confidence: 99%
“…They can lead to increased accuracy of the predictors and also better stability. In particular, ensemble methods have good properties when used in connection with regression and classification trees [3,5,16], but they have also given promising results for other methods [4,5,17,24,26].…”
Section: Introduction
confidence: 99%