2018
DOI: 10.1007/978-3-030-01231-1_14

Transferring GANs: Generating Images from Limited Data

Abstract: Transferring knowledge of pre-trained networks to new domains by means of fine-tuning is a widely used practice for applications based on discriminative models. To the best of our knowledge, this practice has not been studied within the context of generative deep networks. Therefore, we study domain adaptation applied to image generation with generative adversarial networks. We evaluate several aspects of domain adaptation, including the impact of target domain size, the relative distance between source and tar…
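The setting described in the abstract, initializing a GAN from weights pre-trained on a large source domain and then fine-tuning both networks on a small target dataset, can be sketched roughly as follows. This is a minimal illustration under assumed architectures, loss, and checkpoint names, not the authors' actual implementation.

```python
# Minimal sketch of the transfer setting: load a GAN pre-trained on a large
# source domain, then fine-tune both generator and discriminator on a small
# target-domain dataset. Architectures and checkpoint names are placeholders.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, 256), nn.ReLU(),
            nn.Linear(256, 3 * 32 * 32), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 3, 32, 32)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
        )

    def forward(self, x):
        return self.net(x)

def finetune(G, D, target_loader, steps=1000, lr=1e-4, z_dim=128, device="cpu"):
    """Fine-tune a source-pretrained G and D on a small target dataset
    with a standard non-saturating GAN loss."""
    bce = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=lr, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(D.parameters(), lr=lr, betas=(0.5, 0.999))
    data = iter(target_loader)
    for _ in range(steps):
        try:
            real, _ = next(data)
        except StopIteration:
            data = iter(target_loader)
            real, _ = next(data)
        real = real.to(device)
        b = real.size(0)
        # Discriminator update: real target images vs. generated images.
        fake = G(torch.randn(b, z_dim, device=device)).detach()
        loss_d = bce(D(real), torch.ones(b, 1, device=device)) + \
                 bce(D(fake), torch.zeros(b, 1, device=device))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # Generator update: fool the (also fine-tuned) discriminator.
        loss_g = bce(D(G(torch.randn(b, z_dim, device=device))),
                     torch.ones(b, 1, device=device))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Usage (hypothetical checkpoint paths and loader):
# G, D = Generator(), Discriminator()
# G.load_state_dict(torch.load("source_G.pt"))  # weights from the source domain
# D.load_state_dict(torch.load("source_D.pt"))
# finetune(G, D, small_target_loader)
```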

Cited by 170 publications (201 citation statements)
References 27 publications
“…We used the SNGAN model (www.gwern.net/Danbooru2018) used in [21]. Transfer GAN [35]: The pre-trained generator and discriminator are fine-tuned on a small dataset. Transfer GAN (scale and shift): This method is similar to our method, but does not apply supervised training, and instead uses unsupervised training with the discriminator.…”
Section: Image Generation From a Small Dataset Using Comparison Methods (mentioning)
confidence: 99%
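The distinction the excerpt draws, fine-tuning everything (Transfer GAN) versus adapting only scale-and-shift parameters, can be illustrated with a short sketch. The module names and the tiny generator below are assumptions for illustration, not the SNGAN architecture referenced in the excerpt.

```python
# Contrast of the two regimes: Transfer GAN fine-tunes every pre-trained
# parameter; a scale-and-shift variant freezes the pre-trained weights and
# updates only the per-channel scale (gamma) and shift (beta) of the
# normalization layers.
import torch
import torch.nn as nn

def transfer_gan_parameters(g: nn.Module):
    """Transfer GAN regime: every pre-trained parameter stays trainable."""
    for p in g.parameters():
        p.requires_grad = True
    return list(g.parameters())

def scale_and_shift_parameters(g: nn.Module):
    """Scale-and-shift regime: freeze the network, then re-enable only the
    affine parameters of its normalization layers."""
    for p in g.parameters():
        p.requires_grad = False
    params = []
    for m in g.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d)) and m.affine:
            for p in m.parameters():
                p.requires_grad = True
                params.append(p)
    return params

# Placeholder "pre-trained" generator, for illustration only.
generator = nn.Sequential(
    nn.ConvTranspose2d(128, 64, 4), nn.BatchNorm2d(64), nn.ReLU(),
    nn.ConvTranspose2d(64, 3, 4), nn.Tanh(),
)

full = transfer_gan_parameters(generator)
ss = scale_and_shift_parameters(generator)
print("trainable (Transfer GAN):    ", sum(p.numel() for p in full))
print("trainable (scale and shift): ", sum(p.numel() for p in ss))
# Either list can then be handed to an optimizer, e.g. torch.optim.Adam(ss, 1e-4).
```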
“…In this subsection, we conducted an experiment to investigate when the generator can be transferred to the target domain. As discussed in 3.2, the diversity of the filters acquired in the pre-trained generator is thought to affect the … [Figure 9: Comparison of FID on the anime face dataset (left) and KMMD on the flower dataset (right) between our method, Transfer GAN [35], and "Update all".] Note that it is meaningless to compare the performance between different dataset sizes because the data distributions for each dataset size are different.…”
Section: Source Domain Selection (mentioning)
confidence: 99%
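For reference, the FID used in the excerpt above measures the distance between Gaussian fits of Inception features for real and generated images. A minimal sketch of the metric, assuming the (N, D) feature arrays have already been extracted; the feature-extraction step and the KMMD metric are omitted.

```python
# Frechet Inception Distance between two sets of feature activations:
# ||mu1 - mu2||^2 + Tr(C1 + C2 - 2 (C1 C2)^(1/2)). Lower is better.
import numpy as np
from scipy import linalg

def frechet_distance(feats_real: np.ndarray, feats_fake: np.ndarray) -> float:
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov1 = np.cov(feats_real, rowvar=False)
    cov2 = np.cov(feats_fake, rowvar=False)
    # Matrix square root of the covariance product; drop the tiny imaginary
    # component that numerical error can introduce.
    covmean = linalg.sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))

# In the excerpt, a smaller FID on the anime-face target domain indicates a
# more successful transfer from the source generator.
```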
“…Recent progress in domain adaptation alleviates the need for parallel corpora [6,9,10]. In an application to image learning, domain adaptation through GANs has shown the benefit of transferring models pre-trained on other datasets when training on a smaller dataset [32], which provides the technical foundation for our work.…”
Section: Related Work (mentioning)
confidence: 99%
“…This paper explores the intersection of generative adversarial networks (GANs) and interactive evolutionary computation (IEC) within the context of artistic artifact generation and creative expression. GANs are state-of-the-art generative models that have been applied in artifact generation involving video, audio, 3D models, virtual ambiance and videogames [11,12,13,14], but have found most of their success in the generation of 2D images [15]. GANs generate new artifacts by sampling from a learned latent space; however, this sampling process is mostly done stochastically and offers little control over the final output.…”
Section: Introduction (mentioning)
confidence: 99%
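The limited control over stochastic latent sampling noted in the excerpt is what IEC-style approaches work around, typically by treating latent vectors as genomes that can be kept, interpolated, or mutated. A rough sketch under that assumption, with a placeholder generator G:

```python
# Simple handles on a GAN's latent space: sample candidates, interpolate
# between two the user liked, or perturb one as an IEC-style mutation.
import torch

def sample_latents(n: int, z_dim: int = 128, seed: int | None = None):
    g = torch.Generator().manual_seed(seed) if seed is not None else None
    return torch.randn(n, z_dim, generator=g)

def interpolate(z_a: torch.Tensor, z_b: torch.Tensor, steps: int = 8):
    """Linear interpolation between two latent vectors: a crude but effective
    way to steer generation between two outputs the user preferred."""
    alphas = torch.linspace(0.0, 1.0, steps).unsqueeze(1)
    return (1 - alphas) * z_a + alphas * z_b

def mutate(z: torch.Tensor, sigma: float = 0.1):
    """Gaussian perturbation of a latent vector, i.e. a simple IEC mutation."""
    return z + sigma * torch.randn_like(z)

# Usage with any generator G mapping (n, z_dim) latents to images (assumed):
# z = sample_latents(2, seed=0)
# frames = G(interpolate(z[0:1], z[1:2]))  # 8 images morphing between samples
```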