2019
DOI: 10.1609/aaai.v33i01.33019103

Safeguarded Dynamic Label Regression for Noisy Supervision

Abstract: Learning with noisy labels is imperative in the Big Data era, since it reduces the expensive labor of accurate annotation. A previous method, learning with a noise transition, enjoys theoretical guarantees when applied to the class-conditional noise scenario. However, this approach critically depends on an accurate pre-estimated noise transition, which is usually impractical to obtain. A subsequent improvement adapts the pre-estimation, in the form of a Softmax layer, along with the training progress. However, th…

Cited by 55 publications (49 citation statements); references 11 publications.
“…Loss correction approaches [16], [18], [24], [25] either modify the loss directly or the network probabilities to compensate for the incorrect guidance provided by the noisy samples. [25] extend the loss with a perceptual term that introduces a reliance on the model prediction.…”
Section: Related Work
confidence: 99%
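A perceptual term of the kind the quote attributes to [25] is often realized as a soft "bootstrapping" loss, in which the training target is a convex mixture of the noisy label and the model's own prediction. The following is a minimal numpy sketch of that idea under this assumption; the function name and the value of β are illustrative, and this is not claimed to be exactly the formulation in [25]:

```python
import numpy as np

def soft_bootstrap_loss(probs, noisy_onehot, beta=0.95):
    """Soft bootstrapping loss: blend the (possibly noisy) one-hot
    label with the model's own predicted distribution, then take the
    cross-entropy of that blended target against the prediction."""
    probs = np.clip(probs, 1e-12, 1.0)          # avoid log(0)
    target = beta * noisy_onehot + (1.0 - beta) * probs
    return -(target * np.log(probs)).sum(axis=-1).mean()
```

With β close to 1 the loss stays near the standard cross-entropy; lowering β increases the reliance on the model's prediction, which softens the penalty when the model confidently disagrees with a (potentially mislabeled) target.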
“…Patrini et al [24] estimate the label noise transition matrix T , which specifies the probability of one label being flipped to another, and correct the softmax probability by multiplying by T . In the same spirit, Yao et al [18] propose to estimate T in a Bayesian non-parametric form and deduce a dynamic label regression method to train the classifier and model the noise.…”
Section: Related Work
confidence: 99%
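The forward-correction scheme described in the quote, multiplying the softmax output by the estimated transition matrix T, can be sketched in a few lines of numpy. Here T[i, j] is assumed to mean P(noisy = j | clean = i), and all names are illustrative rather than taken from either paper:

```python
import numpy as np

def forward_corrected_nll(probs, noisy_labels, T):
    """Forward loss correction: map the model's clean-class
    probabilities into the noisy-label space via T, where
    T[i, j] = P(noisy = j | clean = i), then take the negative
    log-likelihood of the observed noisy labels."""
    noisy_probs = np.clip(probs @ T, 1e-12, 1.0)
    rows = np.arange(len(noisy_labels))
    return -np.log(noisy_probs[rows, noisy_labels]).mean()

# With an identity transition matrix this reduces to the standard NLL.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])
T_identity = np.eye(3)
```

With a non-identity T, probability mass flows toward frequently flipped classes, so a model that predicts the latent clean label is no longer penalized for disagreeing with the observed noisy one.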
“…Some approaches [1,2,3,4,5,6] deal with the noisy-label issue or labeling biases from crowd-sourced annotations. Many advances [7,8,9,10] have been made in recent years, especially for classification tasks with crowd-sourced labels. For regression tasks, beyond the inconsistency among multiple annotators, intra-annotator consistency also needs to be considered: the labeled rankings of samples from the same annotator usually align well with the rankings of the latent ground-truth scores.…”
Section: Introduction
confidence: 99%