2006 · DOI: 10.1162/neco.2006.18.7.1678

Experiments with AdaBoost.RT, an Improved Boosting Scheme for Regression

Abstract: The application of boosting techniques to regression problems has received relatively little attention, in contrast to research aimed at classification problems. This letter describes a new boosting algorithm, AdaBoost.RT, for regression problems. The idea is to filter out the examples whose relative estimation error is higher than a preset threshold value, and then to follow the AdaBoost procedure. Thus, it requires selecting a suboptimal value of the error threshold to demarcate examples as poor…
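
To make the filtering idea concrete, here is a minimal Python sketch of an AdaBoost.RT-style training loop, assuming scikit-learn-style regressors. The threshold `phi`, the power `n`, the depth-3 tree base learner, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, n_rounds=10, phi=0.1, n=2):
    """AdaBoost.RT-style loop: examples whose relative error exceeds phi are
    treated as poorly predicted and keep their weight; the rest are
    downweighted. Hyperparameters here are illustrative assumptions."""
    m = len(y)
    w = np.full(m, 1.0 / m)                    # uniform initial example weights
    learners, log_inv_betas = [], []
    for _ in range(n_rounds):
        h = DecisionTreeRegressor(max_depth=3)
        h.fit(X, y, sample_weight=w)
        are = np.abs((h.predict(X) - y) / y)   # absolute relative error (assumes y != 0)
        eps = w[are > phi].sum()               # weighted rate of poorly predicted examples
        if eps <= 0.0 or eps >= 1.0:           # degenerate round; stop boosting
            break
        beta = eps ** n
        w = np.where(are <= phi, w * beta, w)  # shrink weights of well-predicted examples
        w /= w.sum()
        learners.append(h)
        log_inv_betas.append(np.log(1.0 / beta))

    def predict(X_new):
        preds = np.array([h.predict(X_new) for h in learners])
        coef = np.array(log_inv_betas)
        return coef @ preds / coef.sum()       # confidence-weighted average
    return predict
```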

Cited by 189 publications (116 citation statements) · References 18 publications

“…Hastie et al (2009) apply it to classification and argue that "boosting" is one of the most powerful learning algorithms currently available. The method has been extended to regression problems in Ridgeway et al (1999) and Shrestha and Solomatine (2006). In the economics literature, Bai and Ng (2009) use boosting for selecting the predictors in factor-augmented autoregressions.…”
Section: Robust Estimation Techniques · mentioning · confidence: 99%
“…The AdaBoost paradigm has also been considered for the regression problem. In [22] the authors present two algorithms, denoted AdaBoost.R2 and AdaBoost.RT. In AdaBoost.R2, all weights are updated at each iteration, while in AdaBoost.RT the weights are updated according to a threshold, so as to put the emphasis only on the examples that are hard to predict.…”
Section: AdaBoost · mentioning · confidence: 99%
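
For contrast with the threshold-based update in the sketch above, a hedged reconstruction of the AdaBoost.R2-style update (in the spirit of Drucker's 1997 formulation) changes every weight each round, in proportion to the example's normalized loss; the function name and the epsilon guard are assumptions for illustration.

```python
import numpy as np

def r2_style_weight_update(w, abs_err, eps=1e-12):
    # Normalize each example's absolute error into a loss in [0, 1].
    L = abs_err / (abs_err.max() + eps)
    L_bar = float(np.dot(w, L))        # weighted average loss of this round's learner
    beta = L_bar / (1.0 - L_bar)       # beta < 1 when the learner fits better than chance
    w = w * beta ** (1.0 - L)          # every weight changes; well-fit examples shrink most
    return w / w.sum()                 # renormalize to a distribution
```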
“…This is accomplished by finding a linear combination of weak learning algorithms that minimizes the total loss over a set of training data, commonly using functional gradient descent [19,20]. Boosting has been successfully applied to several different problems in the machine learning literature, including classification [1,20,21], regression [19,21,22], and prediction [23,24]. However, significantly less attention has been given to the idea of boosting in the online regression framework.…”
Section: Introduction · mentioning · confidence: 99%
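
The functional-gradient-descent view described in the excerpt above can be illustrated for squared loss, where each weak learner is fit to the current residuals (the negative gradient of the loss) and the ensemble is their shrunken sum. The tree base learner and the learning rate below are assumptions for illustration, not the cited papers' exact setup.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=50, lr=0.1):
    """Functional gradient descent for 0.5 * (y - f)^2: each round fits a
    weak learner to the residuals and takes a small shrunken step along it."""
    base = y.mean()                  # initial constant model
    f = np.full(len(y), base)
    learners = []
    for _ in range(n_rounds):
        residuals = y - f            # negative gradient of the squared loss
        h = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
        f = f + lr * h.predict(X)    # shrunken functional step
        learners.append(h)

    def predict(X_new):
        return base + lr * sum(h.predict(X_new) for h in learners)
    return predict
```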