2022
DOI: 10.1145/3485473
Fine-Grained Adversarial Semi-Supervised Learning

Abstract: In this article, we exploit Semi-Supervised Learning (SSL) to increase the amount of training data and improve the performance of Fine-Grained Visual Categorization (FGVC). This problem has not been investigated in the past, despite the prohibitive annotation costs that FGVC requires. Our approach leverages unlabeled data with an adversarial optimization strategy in which the internal feature representation is obtained with a second-order pooling model.
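The abstract states that the internal feature representation is obtained with a second-order pooling model. Below is a minimal sketch of such a bilinear (second-order) pooling layer, assuming a PyTorch setup; the class name and the signed square-root / L2 normalisation details are illustrative assumptions, not the paper's code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SecondOrderPooling(nn.Module):
    """Bilinear (second-order) pooling of convolutional feature maps.

    Turns a (B, C, H, W) feature map into a C*C covariance-like descriptor
    per image, followed by signed square-root and L2 normalisation.
    """

    def forward(self, feats):                       # feats: (B, C, H, W)
        b, c, h, w = feats.shape
        x = feats.reshape(b, c, h * w)              # (B, C, N)
        gram = torch.bmm(x, x.transpose(1, 2)) / (h * w)   # (B, C, C)
        gram = gram.reshape(b, c * c)
        gram = torch.sign(gram) * torch.sqrt(gram.abs() + 1e-12)  # signed sqrt
        return F.normalize(gram, dim=1)             # L2-normalised descriptor
```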

Cited by 8 publications (2 citation statements). References 86 publications.
“…To reduce the noise of pseudo-labels and improve their quality, Zou et al. [51] use label regularization to filter labels with high confidence. To let the model correct its own pseudo-label mistakes, Mugnai et al. [52] propose a Gradient Reversal Layer (GRL) for fine-grained labeling. However, the main…”
Section: Self-supervised and Semi-supervised Learning
Mentioning confidence: 99%
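The Gradient Reversal Layer mentioned in the statement above is a standard construct: identity in the forward pass, negated (and optionally scaled) gradients in the backward pass, so that the feature extractor is trained against an auxiliary discriminator objective. A minimal PyTorch sketch; the scaling factor `lambd` is an illustrative parameter.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips and scales gradients in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed gradient: the layers before the GRL are updated
        # to *maximise* the loss computed after it.
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)
```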
“…The study in [65] designs an SUL GeoNet to jointly estimate the monocular depth map on a set of real-time image files. The authors of [47] use Semi-Supervised Learning to increase the quantity of training data available and to improve the performance of Fine-Grained Visual Categorization. Their approach employs unlabeled data and an adversarial optimization strategy in which a second-order pooling model is used to generate the internal feature representation.…”
Section: Real-time Approaches
Mentioning confidence: 99%
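The statement above summarises the same pipeline as the abstract: unlabeled data, an adversarial objective, and second-order pooled features. The sketch below wires the two helpers from the earlier snippets (`SecondOrderPooling`, `grad_reverse`) into one illustrative training step; the loss weighting, the labeled-vs-unlabeled discriminator, and the module names are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def adversarial_ssl_step(backbone, pool, classifier, discriminator,
                         optimizer, x_lab, y_lab, x_unlab, lambd=0.1):
    """One illustrative step: supervised loss on the labeled batch plus an
    adversarial term that pushes labeled and unlabeled pooled features
    towards the same distribution via gradient reversal."""
    optimizer.zero_grad()

    f_lab = pool(backbone(x_lab))       # (B, C*C) second-order descriptors
    f_unlab = pool(backbone(x_unlab))

    # Supervised fine-grained classification on the labeled batch.
    cls_loss = F.cross_entropy(classifier(f_lab), y_lab)

    # Domain discriminator: 0 = labeled, 1 = unlabeled. The GRL makes the
    # backbone maximise this loss, i.e. produce indistinguishable features.
    feats = torch.cat([f_lab, f_unlab], dim=0)
    domains = torch.cat([torch.zeros(x_lab.size(0)),
                         torch.ones(x_unlab.size(0))]).long().to(feats.device)
    adv_loss = F.cross_entropy(discriminator(grad_reverse(feats, lambd)), domains)

    (cls_loss + adv_loss).backward()
    optimizer.step()
    return cls_loss.item(), adv_loss.item()
```

In this sketch, `backbone` is any convolutional feature extractor, `classifier` a linear head over the C*C pooled descriptor, and `discriminator` a small MLP with two outputs; all three are placeholders chosen for the example.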