2016
DOI: 10.1016/j.knosys.2016.03.024

A robust multi-class AdaBoost algorithm for mislabeled noisy data

Cited by 71 publications (39 citation statements)
References 24 publications
“…For this study, random forest (RF) and AdaBoost (AB) classifiers were used, due to their popularity and representativeness of the respective groups of algorithms, based on bagging and boosting approaches respectively. In addition, the selected classifiers differ in their resistance to the presence of noisy observation data, with the RF classifier considered one of the most resistant algorithms [3,9,27,28].…”
Section: Classification and Validation (mentioning, confidence: 99%)
“…The RF algorithm is commonly considered largely resistant to noisy and mislabeled training data [3,9,27,28,42]. According to some studies, even features containing noise levels as high as 30% result in a decrease in Kappa accuracy in the range of only 10%, which is considered a very moderate decrease [43].…”
Section: Impact Of the On-ground Reference Dataset Modification On Th… (mentioning, confidence: 99%)
“…The AdaBoost (adaptive boosting) algorithm [32,33] is one of the best-known boosting algorithms and has the advantage of being simple and efficient. Using the AdaBoost learning algorithm makes it possible to acquire feature values that better express the target object as learning progresses, so that an accurate and robust recognition algorithm can be created.…”
Section: Filtering Candidate Areas Using Learning Generally (mentioning, confidence: 99%)
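The boosting loop the statement above refers to can be sketched for the classical two-class case: each round fits a weak learner on weighted data, then up-weights the misclassified samples so the next learner focuses on them. This is a minimal illustrative sketch using one-feature decision stumps on a toy 1-D dataset; the helper names (`fit_stump`, `adaboost`) are this sketch's own, not from the paper.

```python
# Minimal sketch of classical binary AdaBoost with decision stumps.
# All names and the toy dataset are illustrative, not from the paper.
import numpy as np

def fit_stump(X, y, w):
    """Pick the threshold/polarity stump with minimum weighted error."""
    best = None
    for thr in np.unique(X):
        for pol in (1, -1):
            pred = np.where(X >= thr, pol, -pol)
            err = np.sum(w[pred != y])
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best  # (weighted error, threshold, polarity)

def adaboost(X, y, rounds=5):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # uniform initial weights
    stumps = []
    for _ in range(rounds):
        err, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)                   # avoid log(1/0)
        alpha = 0.5 * np.log((1 - err) / err)   # weak-learner weight
        pred = np.where(X >= thr, pol, -pol)
        w *= np.exp(-alpha * y * pred)          # up-weight mistakes
        w /= w.sum()
        stumps.append((alpha, thr, pol))
    return stumps

def predict(stumps, X):
    """Weighted vote of all stumps."""
    agg = sum(a * np.where(X >= t, p, -p) for a, t, p in stumps)
    return np.sign(agg)

X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost(X, y)
```

Because mislabeled points are repeatedly misclassified, this exponential re-weighting concentrates ever more weight on them, which is exactly the noise-sensitivity the surveyed robust variants address.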
“…Sun et al., 2016 [10] cited a representative approach named noise-detection-based AdaBoost (ND_AdaBoost), which improves the robustness of AdaBoost in the two-class classification scenario. To resolve this dilemma in the multi-class setting, the authors propose a robust multi-class AdaBoost algorithm (Rob_MulAda), whose key ingredients consist of a noise-detection-based multi-class loss function and a new weight-updating scheme.…”
Section: Recent Work On Adaboost (mentioning, confidence: 99%)