2021
DOI: 10.48550/arxiv.2103.07756
Preprint
Learning with Feature-Dependent Label Noise: A Progressive Approach

Cited by 8 publications (7 citation statements). References 21 publications.
“…Robust learning algorithms include robust networks [2,14,15], robust loss functions [16][17][18][19], and robust regularization [8,32]. Noise detection algorithms include sample selection [20][21][22][23] and pseudo-labeling [33][34][35][36][37]. ELR+ [24], DivideMix [25], and LongReMix [38], which combine several LNL and SSL techniques, are state-of-the-art methods in the field of LNL.…”
Section: Learning With Noisy Labels
confidence: 99%
“…In this section, we compared our method to the most recent state-of-the-art methods: DivideMix (Li et al, 2020a), LossModelling (Arazo et al, 2019), Coteaching+ (Yu et al, 2019), Mixup (Zhang et al, 2017), F-correction (Patrini et al, 2017), SELFIE (Song et al, 2019), PLC (Zhang et al, 2021), PENCIL (Yi and Wu, 2019), ELR (Liu et al, 2020a), NCT, MOIT+ (Ortego et al, 2021), NGC, RRL (Li et al, 2020b), FaMUS, GJS (Ghosh and Lan, 2021), PDLC (Liu et al, 2020b). We show that the proposed method achieves consistent improvements on all datasets and across all noise types and ratios.…”
Section: Synthetic Noisy Datasets Evaluation
confidence: 99%
“…The experiment results show that the iterative label cleaning strategy achieves the best performance. Following this conclusion, we utilize a state-of-the-art cleaning training method [65] that iteratively corrects labels with predicted probabilities above a decreasing threshold.…”
Section: Learning From Noisy Labels
confidence: 99%
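The last statement describes a progressive label-correction loop: at each training round, samples whose model confidence exceeds a threshold are relabeled to the model's prediction, and the threshold is lowered so later rounds correct more labels. A minimal sketch of that loop is below; it is not the cited paper's exact implementation, and the function name, the per-round probability inputs, and the threshold schedule (`theta0`, `decay`, `theta_min`) are all illustrative assumptions.

```python
import numpy as np

def progressive_label_cleaning(probs_per_round, labels,
                               theta0=0.95, decay=0.05, theta_min=0.5):
    """Iteratively correct labels using model predictions.

    probs_per_round: list of (n_samples, n_classes) arrays holding the
    model's predicted class probabilities at each round (assumed given
    here; in practice they come from re-training the network per round).
    labels: (n_samples,) array of possibly noisy integer labels.
    """
    labels = labels.copy()
    theta = theta0
    for probs in probs_per_round:
        conf = probs.max(axis=1)      # model confidence per sample
        pred = probs.argmax(axis=1)   # model's predicted class
        flip = conf > theta           # relabel only confident samples
        labels[flip] = pred[flip]
        # lower the threshold so later rounds correct more labels
        theta = max(theta - decay, theta_min)
    return labels
```

With a high initial threshold only near-certain predictions are flipped early on, which keeps the early corrections conservative while the model is still fitting noisy data; the decaying threshold then extends correction to harder samples as training stabilizes.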