2021 · Preprint
DOI: 10.48550/arxiv.2109.14422
Multi-class Probabilistic Bounds for Self-learning

Abstract: Self-learning is a classical approach to learning with both labeled and unlabeled observations: it assigns pseudo-labels to those unlabeled training instances whose confidence score exceeds a predetermined threshold. At the same time, pseudo-labeling is error-prone and risks introducing noisy labels into the training data. In this paper, we present a probabilistic framework for analyzing self-learning in the multi-class classification scenario with partially labeled data. Fir…
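
The pseudo-labeling loop the abstract describes can be made concrete with a short sketch. Below is a minimal, generic illustration in Python, using scikit-learn's LogisticRegression as a stand-in base classifier; the threshold value and iteration cap are arbitrary placeholders, not the authors' actual procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_learning(X_labeled, y_labeled, X_unlabeled, threshold=0.9, max_rounds=10):
    """Generic self-learning sketch: iteratively pseudo-label unlabeled
    points whose predicted class probability exceeds `threshold`."""
    X_l, y_l = X_labeled.copy(), y_labeled.copy()
    X_u = X_unlabeled.copy()
    clf = LogisticRegression(max_iter=1000)
    for _ in range(max_rounds):
        clf.fit(X_l, y_l)
        if len(X_u) == 0:
            break
        proba = clf.predict_proba(X_u)       # class-membership probabilities
        conf = proba.max(axis=1)             # confidence score per instance
        confident = conf >= threshold        # instances above the threshold
        if not confident.any():
            break                            # nothing left to pseudo-label
        pseudo = proba[confident].argmax(axis=1)
        # Move the confident instances, with their possibly noisy
        # pseudo-labels, into the labeled set (the risk the abstract notes).
        X_l = np.vstack([X_l, X_u[confident]])
        y_l = np.concatenate([y_l, clf.classes_[pseudo]])
        X_u = X_u[~confident]
    return clf
```

Each round fits the classifier on the current labeled set, moves the instances predicted with confidence above the threshold into that set, and stops when no instance qualifies; the noisy-label risk mentioned in the abstract comes precisely from the instances moved on mistaken predictions.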

Cited by 1 publication (1 citation statement)
References 24 publications (45 reference statements)
“…A theoretical perspective for future work is to study the self-training method by explicitly taking into account the fact that the final classifier is learned on training data with noisy labels. Some attempts have already been initiated to model this label noise using the Massart noise [Hadjadj et al., 2021] or the mislabeling transition matrix [Feofanov et al., 2021], but there is a clear need for more research in this direction.…”
Section: Discussion (mentioning, confidence: 99%)
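
For intuition on the transition-matrix idea mentioned in the quoted passage: label noise can be modeled with a matrix T, where T[i, j] is the probability that a true label i is observed as pseudo-label j. A common generic remedy, sketched below under that assumption (this is forward loss correction, not necessarily the construction in [Feofanov et al., 2021]), pushes the model's clean-label distribution through T before scoring against the noisy label:

```python
import numpy as np

# Hypothetical 3-class mislabeling transition matrix:
# T[i, j] = P(observed pseudo-label = j | true label = i); rows sum to 1.
T = np.array([
    [0.90, 0.05, 0.05],
    [0.10, 0.80, 0.10],
    [0.05, 0.15, 0.80],
])

def forward_corrected_nll(clean_proba, noisy_label):
    """Forward correction: map the model's clean-label distribution
    through T, then take the negative log-likelihood of the
    (possibly mislabeled) observed pseudo-label."""
    noisy_proba = clean_proba @ T    # P(noisy = j) = sum_i P(true = i) * T[i, j]
    return -np.log(noisy_proba[noisy_label])

p = np.array([0.7, 0.2, 0.1])        # model's estimate of the true label
print(forward_corrected_nll(p, noisy_label=1))
```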