2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489387

Few-shot Classifier GAN

Abstract: Fine-grained image classification with a few-shot classifier is a highly challenging open problem at the core of numerous data labeling applications. In this paper, we present the Few-shot Classifier Generative Adversarial Network as an approach to few-shot classification. We address the problem of few-shot classification by designing a GAN in which the discriminator and the generator compete to output labeled data in any case. In contrast to previous methods, our techniques generate then classify images into m…
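The adversarial setup the abstract describes, a discriminator and a generator competing, rests on the standard GAN objective. A minimal, hypothetical sketch of the two losses in plain NumPy, with toy probabilities standing in for real network outputs (the values and function names are illustrative, not from the paper):

```python
import numpy as np

def bce(p, y):
    # Binary cross-entropy of predicted probabilities p against labels y.
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Hypothetical discriminator scores on a toy batch:
d_real = np.array([0.9, 0.8, 0.95])  # D's probability that real images are real
d_fake = np.array([0.2, 0.1, 0.3])   # D's probability that generated images are real

# The discriminator tries to push real scores toward 1 and fake scores toward 0.
d_loss = bce(d_real, np.ones(3)) + bce(d_fake, np.zeros(3))

# The (non-saturating) generator loss rewards fooling D into scoring fakes as 1.
g_loss = bce(d_fake, np.ones(3))
```

With the toy scores above the generator loss is large because the discriminator currently rejects the fakes; gradient updates to the generator would aim to reduce it.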

Cited by 25 publications (17 citation statements) | References 13 publications
“…The early version of this architecture, often called Vanilla GAN [35], uses the Kullback-Leibler divergence as the distribution similarity to produce false images. Some other architectures presented in the literature are C-GAN [36], S-GAN [37], AC-GAN [38], and F-SGAN [39], with an ever-growing market in the research community (Hindupuravinash, 'The GAN zoo', Github, hindupuravinash/the-gan-zoo: A list of all named GANs! (github.com)).…”
Section: GAN (mentioning)
confidence: 99%
“…The early version of this architecture, often called Vanilla GAN [36], uses the Kullback-Leibler divergence as the distribution similarity to produce the false images. Some other architectures presented in the literature are C-GAN [37], S-GAN [38], AC-GAN [39] and F-SGAN [40], with an ever-growing market in the research community. [Figure 13: GAN architecture [41]] These networks are typically used to generate more data samples so that the data used to train classification systems can be improved.…”
Section: GAN (mentioning)
confidence: 99%
“…GANs have been successfully applied to different problems including image generation [48,49], segmentation and speech synthesis. In recent years they were also successfully applied to handle class-imbalance problems [50,51].…”
Section: GAN Models (mentioning)
confidence: 99%
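The class-imbalance use mentioned in the last excerpt typically works by sampling a trained generator to augment the minority class before training a classifier. A hypothetical sketch with a stand-in generator (the dataset, distributions, and `fake_generator` are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced dataset: 100 majority vs. 10 minority samples (hypothetical).
X_maj = rng.normal(0.0, 1.0, size=(100, 4))
X_min = rng.normal(3.0, 1.0, size=(10, 4))

def fake_generator(n, dim=4):
    # Stand-in for a trained GAN generator for the minority class;
    # a real one would map latent noise through learned layers.
    return rng.normal(3.0, 1.0, size=(n, dim))

# Generate enough synthetic minority samples to balance the classes.
n_needed = len(X_maj) - len(X_min)
X_min_aug = np.vstack([X_min, fake_generator(n_needed)])
```

After augmentation both classes contribute equally to the classifier's training set, which is the essence of GAN-based oversampling.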