2020
DOI: 10.1109/access.2020.3041873
DCBT-Net: Training Deep Convolutional Neural Networks With Extremely Noisy Labels

Abstract: Obtaining data with correct labels is crucial to attain the state-of-the-art performance of Convolutional Neural Network (CNN) models. However, labeling datasets is a significantly time-consuming and expensive process because it requires expert knowledge in a particular domain. Therefore, real-life datasets often exhibit incorrect labels due to the involvement of nonexperts in the data-labeling process. Consequently, there are many cases of incorrectly labeled data in the wild. Although the issue of poorly label…

Cited by 12 publications (6 citation statements)
References 18 publications
“…In this research, we apply the Graph Laplacian-based sample weighting method to prevent noisy labels. First, inspired by Olimov et al [17], we applied the Graph Laplacian method to find potential noisy labels for each class of inputs. However, unlike [17], instead of deleting all noisy data from the training sets, we assigned different sample weights to those detected noisy data.…”
Section: B Data Preprocessing: Graph Laplacian-Based Sample Weights
confidence: 99%
“…First, inspired by Olimov et al [17], we applied the Graph Laplacian method to find potential noisy labels for each class of inputs. However, unlike [17], instead of deleting all noisy data from the training sets, we assigned different sample weights to those detected noisy data. (ii) Similarity Matrix: A similarity matrix (š‘†) is a square matrix that describes pairwise similarity or dissimilarity between two nodes.…”
Section: B Data Preprocessing: Graph Laplacian-Based Sample Weights
confidence: 99%
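The citing work describes building a similarity matrix over samples, forming its graph Laplacian, flagging samples whose labels disagree with their feature-space neighbours, and down-weighting (rather than deleting) the flagged samples. The sketch below illustrates that general idea in NumPy; it is an assumption-laden illustration, not the exact procedure of either paper: the RBF similarity kernel, the unnormalised Laplacian L = D − S, and the exponential weighting rule are all choices made here for concreteness.

```python
import numpy as np

def laplacian_sample_weights(features, labels, sigma=1.0):
    """Score each sample's label consistency with its feature-space
    neighbours via the unnormalised graph Laplacian L = D - S, then
    turn the scores into soft per-sample weights in (0, 1].

    Illustrative sketch only; kernel and weighting are assumptions.
    """
    # Pairwise similarity matrix S (RBF kernel on squared distances).
    diff = features[:, None, :] - features[None, :, :]
    dist2 = (diff ** 2).sum(axis=-1)
    S = np.exp(-dist2 / (2.0 * sigma ** 2))
    np.fill_diagonal(S, 0.0)

    # Graph Laplacian L = D - S, with D the degree matrix.
    D = np.diag(S.sum(axis=1))
    L = D - S

    # One-hot label matrix Y. Row i of (L @ Y), taken at sample i's own
    # class column, equals the total similarity of i to differently
    # labelled neighbours -- a label-disagreement score.
    classes = np.unique(labels)
    Y = (labels[:, None] == classes[None, :]).astype(float)
    own_col = np.searchsorted(classes, labels)
    disagreement = (L @ Y)[np.arange(len(labels)), own_col]

    # Down-weight samples whose label conflicts with their neighbourhood
    # instead of deleting them from the training set.
    return np.exp(-disagreement / (disagreement.mean() + 1e-12))
```

As a quick sanity check, a sample embedded in one cluster but carrying the other cluster's label receives the largest disagreement score and therefore the smallest weight, while cleanly labelled samples keep weights near 1.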
“…Deep learning models have shown competitive performance compared to dermatologists demonstrated in Reference 14. In this regard, CNN 15,16 quickly becomes a choice in examining dermoscopic images. [17][18][19][20][21][22][23][24][25] For better classification performance, accurate lesion area extraction is very important.…”
Section: Related Work
confidence: 99%
“…Image classification involves the extraction of useful features from a digital image and the classification of the image into one of the pre-defined classes based on the extracted features 12 , 13 . Manual verification and classification of digital images can be a laborious and monotonous process; thus, automating the image analysis process by using image classification methods is more efficient and less time-consuming 14 , 15 . Recent advances in these methods have facilitated the usage of image classification in several real-world applications, such as medical imaging 16 , 17 , face recognition 18 , human activity recognition 19 , and traffic control systems 20 , 21 .…”
Section: Introduction
confidence: 99%