Proceedings of the 2023 6th International Conference on Machine Vision and Applications
DOI: 10.1145/3589572.3589596
An Efficient Noisy Label Learning Method with Semi-supervised Learning

Cited by 1 publication (1 citation statement)
References 2 publications
“…Scaling up a dataset for pretraining has advanced in deep learning communities while suggesting a new horizon for zero-shot and few-shot transfer. The recent success of pretrained language models, such as bidirectional encoder representations from transformers (BERT; Devlin et al., 2018) and the generative pretrained transformer (GPT) series (Brown et al., 2020; Radford et al., 2018, 2019), inspired researchers to extend these models to perform vision-related tasks (Chen et al., 2020; Kim et al., 2021; L. H. Li et al., 2019; X.…”
Section: CLIP (mentioning)
confidence: 99%