2021
DOI: 10.21203/rs.3.rs-176698/v1
Preprint

Label Noise Detection System Against Label Flipping Attack

Abstract: A label flipping attack is a special type of poisoning attack in adversarial environments. By injecting label noise into the training data, it corrupts the model's learning process and degrades its decision-making performance. Recent work in the literature uses semi-supervised learning techniques to defend against label flipping attacks; however, these methods require a clean dataset to achieve their goals. This study proposes a novel label noise processing framework to correct the labels of contaminated samples in the dat…
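For intuition about the setting the abstract describes, here is a minimal, hypothetical Python sketch. It is not the paper's framework: it simulates a random label flipping attack on a toy dataset and then flags and corrects suspected noisy labels with a simple k-nearest-neighbor majority vote. The flip rate, the value of k, and the voting rule are all illustrative assumptions.

# Illustrative sketch only: NOT the authors' method. Shows (1) a random
# label flipping attack on a toy dataset and (2) a naive k-NN disagreement
# heuristic for detecting and correcting the flipped labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Toy binary classification data standing in for a training set.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# --- Label flipping attack: flip 20% of the training labels (assumed rate). ---
flip_rate = 0.2
n_flip = int(flip_rate * len(y))
flip_idx = rng.choice(len(y), size=n_flip, replace=False)
y_noisy = y.copy()
y_noisy[flip_idx] = 1 - y_noisy[flip_idx]

# --- Naive detection/correction: a sample whose label disagrees with the
# majority of its k nearest neighbors is treated as contaminated and is
# relabeled with the neighborhood majority vote. ---
k = 10
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
_, idx = nn.kneighbors(X)                 # idx[:, 0] is the sample itself
neighbor_labels = y_noisy[idx[:, 1:]]     # labels of the k true neighbors
majority = (neighbor_labels.mean(axis=1) > 0.5).astype(int)
suspect = majority != y_noisy             # flagged as possible label noise

y_corrected = y_noisy.copy()
y_corrected[suspect] = majority[suspect]

print(f"flipped: {n_flip}, flagged: {suspect.sum()}, "
      f"label accuracy before: {(y_noisy == y).mean():.2f}, "
      f"after correction: {(y_corrected == y).mean():.2f}")

Note that this heuristic needs no clean reference dataset, which is the property the abstract highlights as missing from semi-supervised defenses; the paper's actual correction mechanism may differ entirely.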

Cited by 1 publication
References 25 publications