2016
DOI: 10.1007/s10994-016-5572-x

Cost-sensitive boosting algorithms: Do we really need them?

Abstract: We provide a unifying perspective for two decades of work on cost-sensitive Boosting algorithms. When analyzing the literature 1997-2016, we find 15 distinct cost-sensitive variants of the original algorithm; each of these has its own motivation and claims to superiority, so who should we believe? In this work we critique the Boosting literature using four theoretical frameworks: Bayesian decision theory, the functional gradient descent view, margin theory, and probabilistic modelling. Our finding is that only t…
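For context on the Bayesian decision theory framework named in the abstract, the standard cost-sensitive decision rule (a textbook result, not a quotation from the paper) predicts the positive class whenever its expected cost is lower:

\[
\text{predict } y = 1
\;\Longleftrightarrow\;
c_{FN}\, p(y=1 \mid x) > c_{FP}\, p(y=0 \mid x)
\;\Longleftrightarrow\;
p(y=1 \mid x) > \frac{c_{FP}}{c_{FP} + c_{FN}},
\]

where \(c_{FP}\) and \(c_{FN}\) denote the costs of a false positive and a false negative. Calibrated probability estimates combined with this shifted threshold give a simple cost-sensitive baseline; the citation statements below refer to this recipe.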

Cited by 61 publications (65 citation statements) | References 23 publications
“…It also remains an option available in toolkits such as Weka, scikit-learn, and R, so the results should be of interest to practitioners. Use of the original TAN learning algorithm in order to assess the extent to which EBNO makes a difference in comparison to an algorithm that focuses on maximizing the accuracy of TANs. Use of two boosting algorithms: AdaBoost to induce cost-sensitive TANs and XGBoost to induce cost-sensitive decision trees. Although AdaBoost is one of the earliest boosting algorithms, as the study by Nikolaou et al. (2016) concludes, it remains an important method for boosting and cost-sensitive learning. XGBoost is included primarily because it is a more recent innovation and has produced some of the best results in Kaggle competitions.…”
Section: Empirical Evaluation and Results (mentioning)
confidence: 99%
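The study quoted above uses AdaBoost and XGBoost for cost-sensitive learning. A minimal sketch of how misclassification costs are typically supplied to these two learners (illustrative only, not the cited study's code; the synthetic dataset and the 5:1 cost ratio are assumptions):

# Minimal sketch (illustrative): supplying costs as instance weights to
# AdaBoost and as a class-weight ratio to XGBoost.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from xgboost import XGBClassifier  # assumes the xgboost package is installed

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

# Assumed cost model: a false negative is 5x as costly as a false positive.
c_fn, c_fp = 5.0, 1.0
sample_weight = np.where(y == 1, c_fn, c_fp)

ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X, y, sample_weight=sample_weight)       # costs enter via instance weights

xgb = XGBClassifier(n_estimators=100, scale_pos_weight=c_fn / c_fp)
xgb.fit(X, y)                                    # costs enter via the positive-class weight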
“…However, in a more recent paper, Cost-sensitive boosting algorithms: Do we really need them?, Nikolaou, Narayanan, Kull, Flach, and Brown (2016) present a critique of cost-sensitive boosting algorithms from multiple perspectives and conclude that AdaBoost performs just as well as other variations of boosting algorithms.…”
Section: Background On Cost-sensitive Learning (mentioning)
confidence: 99%
“…The way the output of a boosting model is transformed into a probability (the calibration process) plays a key role in the performance of the predictive algorithm. It has been shown that we can achieve at least the same performance with a well-calibrated boosting model as with one that is cost-sensitive [14]. However, we think that our cost-sensitive approach gives us a good transformation of the output of the model into a probability.…”
Section: Cost Sensitive Loss For Gradient Boosting (mentioning)
confidence: 91%
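The passage above refers to calibrating a boosting model's scores into probabilities and then making cost-sensitive decisions with them. A minimal sketch of that recipe in scikit-learn (the dataset, the choice of Platt scaling, and the 5:1 cost ratio are illustrative assumptions, not details from the cited work):

# Minimal sketch: calibrate AdaBoost scores, then apply a cost-sensitive threshold.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier
from sklearn.calibration import CalibratedClassifierCV

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Calibrate the ensemble's scores into probabilities (Platt scaling via cross-validation).
base = AdaBoostClassifier(n_estimators=100, random_state=0)
calibrated = CalibratedClassifierCV(base, method="sigmoid", cv=5).fit(X_tr, y_tr)

# Assumed costs: a false negative is 5x as costly as a false positive.
c_fp, c_fn = 1.0, 5.0
threshold = c_fp / (c_fp + c_fn)                 # Bayes-optimal threshold for these costs

proba = calibrated.predict_proba(X_te)[:, 1]
y_pred = (proba >= threshold).astype(int)        # cost-sensitive decisions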
“…Additional adaptive boosting algorithm variants such as TotalBoost [34], BrownBoost [12], LPBoost [7] and SmoothBoost [33], and those in [8], will not be considered in this tutorial, nor will cost-sensitive boosting in asymmetric statistical learning problems ([25], [24] and references therein, in particular those presented in Table 3.1, Chapter 3).…”
Section: On Adaboost: An Excursus In the Literature (mentioning)
confidence: 99%
“…After a description of AdaBoostClassifier(), we briefly turn our attention to a specific gradient boosting software implementation. In Python, gradient boosting is implemented in the sklearn library by the GradientBoostingClassifier() and GradientBoostingRegressor() functions; both functions use trees as base learners. Let us focus on classification problems only.…”
Section: Boosting Algorithms: Gradientboost (mentioning)
confidence: 99%
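To accompany the description quoted above, a short illustration of the two scikit-learn classifiers it names; the synthetic dataset and hyperparameter values are placeholders, not recommendations from the cited tutorial:

# Short illustration of sklearn's AdaBoost and tree-based gradient boosting classifiers.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ada = AdaBoostClassifier(n_estimators=100).fit(X_tr, y_tr)
gbc = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3).fit(X_tr, y_tr)

print("AdaBoost accuracy:        ", ada.score(X_te, y_te))
print("GradientBoosting accuracy:", gbc.score(X_te, y_te))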