2019 IEEE/CVF International Conference on Computer Vision (ICCV) 2019
DOI: 10.1109/iccv.2019.00019
NLNL: Negative Learning for Noisy Labels

Abstract: Convolutional Neural Networks (CNNs) provide excellent performance when used for image classification. The classical method of training CNNs is by labeling images in a supervised manner as in "input image belongs to this label" (Positive Learning; PL), which is a fast and accurate method if the labels are assigned correctly to all images. However, if inaccurate labels, or noisy labels, exist, training with PL will provide wrong information, thus severely degrading performance. To address this issue, we start w…

Cited by 219 publications (194 citation statements)
References 53 publications
“…Kim et al. [27] propose the learning method called Negative Learning (NL), which uses a complementary label (i.e., a label that differs from the given annotation). Because the chance of selecting the true label as the complementary label is low, NL reduces the risk of providing incorrect information.…”
Section: Mixup [55] Is a Technique Proposed for Data Augmentation
confidence: 99%
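The excerpt above describes NL only informally. A minimal sketch of what a complementary-label loss can look like, assuming softmax outputs and a complementary label drawn uniformly from the classes other than the annotation (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def draw_complementary_labels(labels, num_classes, rng):
    """Draw, for each sample, a label uniformly from the classes
    other than its (possibly noisy) annotation."""
    offsets = rng.integers(1, num_classes, size=len(labels))
    return (labels + offsets) % num_classes  # never equals the annotation

def negative_learning_loss(probs, comp_labels, eps=1e-12):
    """NL-style loss: suppress the probability assigned to the
    complementary label, i.e. -log(1 - p_c), averaged over the batch."""
    p_c = probs[np.arange(len(comp_labels)), comp_labels]
    return float(-np.mean(np.log(1.0 - p_c + eps)))

# Example: one sample over 3 classes, complementary label 1
probs = np.array([[0.7, 0.2, 0.1]])
loss = negative_learning_loss(probs, np.array([1]))  # -log(1 - 0.2) ≈ 0.223
```

Because a uniformly drawn complementary label equals the true label with probability only 1/(C-1) even when the annotation is wrong, the gradient signal is rarely misleading, which is the risk-reduction property the citation refers to.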
“…Northcutt et al. [14] proposed confident learning for characterizing, identifying, and learning with noisy labels. Kim et al. [15] proposed Selective Negative Learning and Positive Learning (SelNLPL) to filter noisy data and learn from it. These methods face the problem of discriminating difficult samples from mismatched labels.…”
Section: Learning From Noisy Labels
confidence: 99%
“…Existing methods based on RGB images with noisy labels usually make the strong assumption that all labels are noisy. These studies mostly develop algorithms robust to noisy labels [13], label-cleansing methods that find label errors [14], or combinations of the two [15]. These classifiers have been shown to achieve good accuracy on noisy CIFAR-10/100 datasets.…”
Section: Introduction
confidence: 99%
“…Inspired by the benefit of complementary learning [54], we propose to combine the strengths of the three loss functions in Eq. (7), Eq.…”
Section: Final Loss Function
confidence: 99%