Proceedings of the 24th International Conference on Machine Learning 2007
DOI: 10.1145/1273496.1273573
Asymmetric boosting

Abstract: A cost-sensitive extension of boosting, denoted as asymmetric boosting, is presented. Unlike previous proposals, the new algorithm is derived from sound decision-theoretic principles, which exploit the statistical interpretation of boosting to determine a principled extension of the boosting loss. Similarly to AdaBoost, the cost-sensitive extension minimizes this loss by gradient descent on the functional space of convex combinations of weak learners, and produces large margin detectors. It is shown that asymm…
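As a rough illustration of the kind of training loop the abstract describes (a cost-sensitive, AdaBoost-style procedure; the decision-stump weak learner, the cost parameters c_pos/c_neg, and the exponential reweighting rule below are illustrative assumptions, not the authors' exact derivation):

    import numpy as np

    def asymmetric_boost(X, y, c_pos=2.0, c_neg=1.0, n_rounds=20):
        """Toy cost-sensitive AdaBoost-style loop: y in {-1, +1};
        errors on positives are weighted by c_pos, on negatives by c_neg."""
        cost = np.where(y == 1, c_pos, c_neg)
        w = cost / cost.sum()                     # cost-skewed initial weights
        ensemble = []                             # (alpha, feature, threshold, sign)

        for _ in range(n_rounds):
            # exhaustive search over decision stumps for the lowest weighted error
            best = None
            for j in range(X.shape[1]):
                for thr in np.unique(X[:, j]):
                    for sign in (1, -1):
                        pred = sign * np.where(X[:, j] > thr, 1, -1)
                        err = w[pred != y].sum()
                        if best is None or err < best[0]:
                            best = (err, j, thr, sign)
            err, j, thr, sign = best
            err = np.clip(err, 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)          # AdaBoost step size
            pred = sign * np.where(X[:, j] > thr, 1, -1)
            w *= np.exp(-alpha * y * pred * cost)          # illustrative cost-weighted reweighting
            w /= w.sum()
            ensemble.append((alpha, j, thr, sign))

        def predict(Xq):
            scores = sum(a * s * np.where(Xq[:, j0] > t, 1, -1)
                         for a, j0, t, s in ensemble)
            return np.sign(scores)
        return predict

With c_pos = c_neg the sketch collapses to an ordinary AdaBoost loop over decision stumps; the asymmetry enters only through the per-example costs in the initial weights and the reweighting step.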

Cited by 72 publications (69 citation statements). References 14 publications.
“…As for the other variants, they are all methods that modify the training algorithm. CSB0 and CSB1 [17] do not use confidence-rated predictions and, based on the results of comparative studies [9,10,15], the two variants are typically dominated by CSB2. Asymmetric-Adaboost [18] was excluded from said studies as being similar to CSB2.…”
Section: Discussion
confidence: 99%
“…However it is often regarded as skew-insensitive [15,17], meaning it is unable to handle asymmetric tasks. There exist many skew-sensitive AdaBoost variants, including AdaCost [2,17], CSB0, CSB1, CSB2 [17], Asymmetric-Adaboost [18], RareBoost [6], AdaC1, AdaC2, AdaC3 [16], CS-AdaBoost [9,10]. However, most of them are heuristic and as a result they lack the theoretical guarantees of the original AdaBoost [7].…”
Section: Introduction
confidence: 99%
“…In [112], the above work was extended by proposing to balance the skewness of labels presented to each weak classifier, so that they are trained more equally. In [113], a more rigorous form of asymmetric boosting, based on the statistical interpretation of boosting [55] and an extension of the boosting loss, was proposed. Namely, the exponential cost criterion in (3) is rewritten as:…”
Section: Variations of the Boosting Learning Algorithm
confidence: 99%
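The equation itself is cut off in the excerpt above. The cost-sensitive exponential criterion of [113] is commonly quoted in the form below, reproduced here as a reference sketch (with C_1 and C_2 the costs assigned to false negatives and false positives, respectively; it should be checked against the original paper):

    J(f) = E[ I(y = 1) e^{-C_1 f(x)} + I(y = -1) e^{C_2 f(x)} ]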
“…The method in [113] minimized the above criterion following the AnyBoost framework in [114]. The method was able to build a detector with a very high detection rate [115], though the performance of the detector deteriorates very quickly when the required false positive rate is low.…”
Section: Variations of the Boosting Learning Algorithm
confidence: 99%
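For context on the AnyBoost step mentioned in that excerpt (a sketch under the assumption that the loss takes the cost-sensitive exponential form shown earlier, not a quotation from [113] or [114]): AnyBoost selects each weak learner to have maximal inner product with the negative functional gradient of the loss, which here reduces to

    h_t = argmax_h sum_i w_i y_i h(x_i),   with   w_i = C_{y_i} e^{- y_i C_{y_i} f_{t-1}(x_i)},

so the examples are reweighted as in AdaBoost, but with the class-dependent costs scaling both the exponent and the weight.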
“…where I(y = 1) and I(y = −1) are indicator functions [11]. Hence, (23) is an asymmetric boosting loss function that can be minimized in a manner similar to AdaBoost, by gradient descent on the space of convex combinations of weak learners.…”
Section: A Cost-Sensitive Boosting Algorithm
confidence: 99%
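As a sanity check on that loss (my own note, not part of the quoted text): setting both costs equal recovers the standard AdaBoost exponential loss, since for y in {-1, +1}

    I(y = 1) e^{-C f(x)} + I(y = -1) e^{C f(x)} = e^{-C y f(x)},

so C_1 = C_2 = 1 gives back the usual e^{-y f(x)} criterion, and the asymmetry of the detector comes entirely from choosing C_1 different from C_2.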