2016
DOI: 10.1109/tnnls.2015.2475750

RBoost: Label Noise-Robust Boosting Algorithm Based on a Nonconvex Loss Function and the Numerically Stable Base Learners

Abstract: AdaBoost has attracted much attention in the machine learning community because of its excellent performance in combining weak classifiers into strong classifiers. However, AdaBoost tends to overfit noisy data in many applications. Accordingly, improving the antinoise ability of AdaBoost plays an important role in many applications. The sensitivity of AdaBoost to noisy data stems from the exponential loss function, which puts unrestricted penalties on misclassified samples with very large marg…
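The abstract's point about unrestricted penalties can be illustrated numerically. The sketch below compares the exponential loss on the signed margin m = y·f(x) with a bounded saturating alternative; the bounded form chosen here (a logistic-style loss) is an illustrative assumption, not the specific nonconvex loss proposed in RBoost.

```python
import math

def exp_loss(m):
    # AdaBoost's exponential loss: grows without bound as the margin
    # of a misclassified sample becomes more negative, so label-noise
    # outliers dominate the sample-weight distribution.
    return math.exp(-m)

def bounded_loss(m):
    # A logistic-style saturating loss (illustrative, not RBoost's):
    # the penalty approaches 1 for badly misclassified points instead
    # of blowing up.
    return 1.0 - 1.0 / (1.0 + math.exp(-2.0 * m))

for m in [2.0, 0.0, -2.0, -6.0]:
    print(f"margin {m:+.1f}: exp {exp_loss(m):10.2f}  bounded {bounded_loss(m):.3f}")
```

At margin -6 the exponential loss already exceeds 400 while the bounded loss stays below 1, which is the mechanism by which bounded losses limit the influence of mislabeled samples.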

Cited by 82 publications (41 citation statements)
References 31 publications (36 reference statements)
“…However, a fixed number of NNs may cause an over-fitting and/or under-fitting phenomenon [7], similar to other typical learning issues (e.g., Boosting [8]). …”
Section: Introduction
confidence: 89%
See 1 more Smart Citation
“…However, fixed number of NNs may cause over-fitting and/or under-fitting phenomenon [7] which is similar to other typical learning issues (e.g. Boosting [8]). …”
Section: Introductionmentioning
confidence: 89%
“…CDL models the dictionary learning problem as a bilevel optimization problem that minimizes the squared loss term on both feature spaces, namely, one optimization problem on the observation feature space and the other on the latent feature space. To balance the reconstruction error on the observation feature space and the latent feature space, the CDL algorithm changes the objective function of [8]:…”
Section: Coupled Dictionary Learning
confidence: 99%
“…On the other hand, unbounded increment of the penalty value reveals the overfitting problem. Therefore, bounded loss functions and their boosting algorithms have been proposed in recent years [13,14]. TangentBoost is an alternative method whose loss function is bounded.…”
Section: Boosting Algorithms In Binary Classification
confidence: 99%
“…As a result, classifiers can be improper and their generalization ability may not be good. To make classifiers more stable, some researchers proposed robust boosting algorithms [13,14,16,17,18]. The idea behind the TangentBoost algorithm is probability elicitation and conditional risk minimization [19].…”
Section: TangentBoost and the Correction
confidence: 99%
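The bounded TangentBoost loss mentioned above can be sketched directly. The form used below, phi(m) = (2·atan(m) − 1)² on the margin m = y·f(x), is my reading of the TangentBoost literature and should be treated as an assumption; the key property it demonstrates — the penalty saturates because atan is bounded — holds regardless.

```python
import math

def tangent_loss(m):
    # Tangent loss on the signed margin m = y * f(x) (assumed form).
    # atan saturates at +/- pi/2, so the penalty for an arbitrarily
    # badly misclassified (possibly mislabeled) sample stays finite,
    # bounded above by (pi + 1) ** 2.
    return (2.0 * math.atan(m) - 1.0) ** 2

# Penalty saturates instead of growing without bound:
for m in [-100.0, -10.0, 0.0, 10.0]:
    print(f"margin {m:+.1f}: tangent loss {tangent_loss(m):.3f}")
```

Contrast with the exponential loss, where the penalty at margin -100 would be astronomically large; here it stays near its finite supremum.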
“…After the training step, a license plate test image located in a natural scene is fed to the system. Then a strong classifier trained by the AdaBoost algorithm [7] is used to classify parts of the image within a search window as either license plate or non-license plate. The original AdaBoost algorithm is simple and fast; however, it fails to detect license plates when the range of variation in distance or viewing angle increases.…”
Section: Related Work
confidence: 99%