2019
DOI: 10.48550/arxiv.1901.09024
Preprint
Diversity-Sensitive Conditional Generative Adversarial Networks

Abstract: We propose a simple yet highly effective method that addresses the mode-collapse problem in the Conditional Generative Adversarial Network (cGAN). Although conditional distributions are multi-modal (i.e., having many modes) in practice, most cGAN approaches tend to learn an overly simplified distribution in which an input is always mapped to a single output regardless of variations in the latent code. To address this issue, we propose to explicitly regularize the generator to produce diverse outputs depending on late…

Cited by 54 publications (50 citation statements)
References 16 publications
“…To alleviate mode collapse in GANs, DistanceGAN [2] proposes to preserve the distances between input pairs in the corresponding generated output pairs. A similar scheme has been employed for both unconditional [27,18] and conditional [19,35] generation tasks to increase diversity in the generations. In our work, we aim to inherit the learned diversity from the source model to the target model and achieve this via our novel cross-domain distance consistency loss.…”
Section: Domain Translation
confidence: 99%
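The distance-preservation idea cited above (DistanceGAN) can be illustrated with a minimal numeric sketch: the penalty compares the distance between two inputs with the distance between their generated outputs, and penalizes the mismatch. The function name, use of the Euclidean norm, and the absolute-difference form are illustrative assumptions, not the exact formulation of any cited paper.

```python
import numpy as np

def distance_consistency_loss(x1, x2, y1, y2):
    """Sketch of a distance-preservation penalty in the spirit of
    DistanceGAN: the distance between two generated outputs (y1, y2)
    should match the distance between their inputs (x1, x2).
    Norm choice and normalization are illustrative assumptions."""
    d_in = np.linalg.norm(x1 - x2)    # distance between the two inputs
    d_out = np.linalg.norm(y1 - y2)   # distance between their generated outputs
    return abs(d_in - d_out)          # penalize any mismatch in pairwise distance
```

Minimizing this term over sampled input pairs encourages the generator to keep distinct inputs distinct in output space, which counteracts the many-to-one collapse described above.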
“…As our method generates images conditioned on extra inputs, variations of the output images are restricted, especially when a color image is given as a condition input. To enable the generator network to produce semantically diverse images based on the condition input, we regularize the generator network with the diversity-sensitive loss (Yang et al 2019a). This is defined as…”
Section: Pose-Consistent Diversity Loss
confidence: 99%
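The diversity-sensitive regularizer referenced above (Yang et al. 2019a) can be sketched as the ratio of output distance to latent distance, clipped at a bound τ; the generator maximizes this term so that distinct latent codes produce distinct outputs. This is a minimal sketch assuming Euclidean norms and a small stabilizing epsilon; the exact norm and bounding used in any given implementation may differ.

```python
import numpy as np

def diversity_sensitive_loss(out1, out2, z1, z2, tau=10.0):
    """Sketch of the diversity-sensitive regularizer: the ratio of
    the distance between two generated outputs to the distance
    between their latent codes, bounded by tau for stability.
    The generator *maximizes* this quantity (equivalently, minimizes
    its negative) to avoid collapsing all latent codes to one output."""
    num = np.linalg.norm(out1 - out2)      # distance between generated outputs
    den = np.linalg.norm(z1 - z2) + 1e-8   # distance between latent codes
    return min(num / den, tau)             # clip the ratio to keep training stable
```

When the generator ignores the latent code (identical outputs for different z), the term is zero and the regularizer pushes the outputs apart; the bound τ prevents the ratio from dominating the adversarial objective.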
“…Pose Penalty To validate the importance of the pose-penalty in relation to the diversity-sensitive loss (Yang et al 2019a) for our method, we conduct an ablation study to confirm its effect when the diversity-sensitive loss is attached during training. As shown in Fig.…”
Section: Analysis Of Experiments
confidence: 99%
“…Moreover, we conducted experiments with changed hyperparameters, modifying γ to control the loss weighting (see Fig. 4), or choosing a different architecture such as the Diversity-Sensitive Conditional GAN [55], which is designed to enforce variability through a different structure and loss function. None of these attempts was successful in enforcing more diversity in the generated images.…”
Section: Weekly Development Of Reference Plants
confidence: 99%