2022
DOI: 10.1002/mp.15799

Bayesian statistics‐guided label refurbishment mechanism: Mitigating label noise in medical image classification

Abstract: Purpose: Deep neural networks (DNNs) have been widely applied in medical image classification, benefiting from their powerful mapping capability among medical images. However, existing deep learning‐based methods depend on an enormous amount of carefully labeled images, and noise is inevitably introduced in the labeling process, degrading model performance. Hence, it is significant to devise robust training strategies that mitigate label noise in medical image classification tasks. Methods: …


Cited by 5 publications (4 citation statements) · References 43 publications
“…Notably, the noise rate was set as 8% for sample-selection style methods. Meanwhile, we report some performance of other popular anti-noise methods based on the ANIMAL10N given in original papers [13], [29]. We conducted our method on different backbones, including ResNet-18 and VGG-19.…”
Section: Experimental Results on Real-World Noisy Dataset
Confidence: 99%
“…• Label smoothing and refurbishment: the former transforms the hard label y (e.g., one-hot encoded for a K-class classification) into a soft target y_s, acting as a regularization technique [288]: y_s = (1 − α)y + α/K, where the α parameter modulates the level of confidence during training, thus avoiding over-fitted predictions. The latter consists in replacing the original noisy label with a refurbished one; Gao et al. [289] adopted a plug-and-play additional module with Bayesian statistics and a time-weighting module for optimal label selection.…”
Section: B. Improving the Generalization Ability of Models, 1) Tackling...
Confidence: 99%
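The quoted passage defines label smoothing and describes refurbishment. A minimal NumPy sketch of both transforms follows; the refurbishment rule shown is the generic convex-combination form, not the Bayesian module from the cited paper, and the function names, `alpha`, and `beta` are illustrative choices:

```python
import numpy as np

def smooth_labels(y_onehot, alpha):
    # Label smoothing: y_s = (1 - alpha) * y + alpha / K
    K = y_onehot.shape[-1]
    return (1.0 - alpha) * y_onehot + alpha / K

def refurbish_label(y_given, y_pred, beta):
    # Generic refurbishment: mix the given (possibly noisy) label
    # with the model's current prediction.
    return beta * y_given + (1.0 - beta) * y_pred

y = np.eye(3)[1]                    # hard one-hot label for class 1, K = 3
print(smooth_labels(y, alpha=0.1))  # [0.0333... 0.9333... 0.0333...]
```

Both transforms keep the target a valid probability vector (entries sum to 1), which is why they can be dropped into a standard cross-entropy loss unchanged.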
“…Studies often focus on traditional ensemble methods [14], but the advantages of strategically analyzing predicted probabilities using an SVM meta-learner warrant further investigation. Meta-learning frameworks have shown the ability to surpass the accuracy of single-model or ensemble approaches in other domains by learning patterns of model performance across diverse tasks [29, 30]. Applying this concept to CVD risk prediction holds the potential to significantly improve accuracy, as individual models often demonstrate varying reliability when dealing with complex patient risk factors [31].…”
Section: Introduction
Confidence: 99%
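The stacking idea in the quoted passage — base models whose predicted probabilities feed an SVM meta-learner — can be sketched with scikit-learn. The synthetic dataset and base models here are placeholders, not the CVD setup from the cited study:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base learners emit class probabilities; the SVM meta-learner
# is trained on those stacked probabilities.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=SVC(),
    stack_method="predict_proba",
)
stack.fit(X_tr, y_tr)
print(stack.score(X_te, y_te))
```

`stack_method="predict_proba"` is what makes the meta-learner see probabilities rather than hard votes, matching the "strategically analyzing predicted probabilities" framing in the quote.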