Multi-Generator Generative Adversarial Nets
Preprint, 2017
DOI: 10.48550/arxiv.1708.02556

Cited by 24 publications (32 citation statements). References 2 publications.
“…It contains 10 balanced classes of images of different objects. Just to further illustrate the merits of multi-agent GANs, we offer a selection of images generated by WGAN-GP [ACB17] and the MGAN [HNLP17].…”
Section: CIFAR-10
confidence: 99%
“…It is worth mentioning that the generators of the above three GAN models are not trained simultaneously, which may cover more modes but also produce more poor samples step by step. Ensemble GAN models that train their generators simultaneously include Mix+GAN (Arora et al., 2017), MAD-GAN (Ghosh et al., 2018), MGAN (Hoang et al., 2017), MEGAN (Park et al., 2018), and DeLiGAN (Gurumurthy et al., 2017). Mix+GAN has multiple generators and discriminators with independent neural network parameters and learnable mixture weights, which is computationally expensive.…”
Section: Related Work
confidence: 99%
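
To make the Mix+GAN formulation in the statement above concrete, here is a minimal PyTorch sketch of a mixture of K independently parameterized generator/discriminator pairs with learnable mixture weights. All module shapes and names are illustrative assumptions, not the architecture of Arora et al. (2017):

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureOfGANs(nn.Module):
    # K independent generator/discriminator pairs plus learnable
    # mixture logits; softmax(mix_logits) gives the mixture weights.
    def __init__(self, k=4, z_dim=64, x_dim=784, h=128):
        super().__init__()
        self.z_dim = z_dim
        self.generators = nn.ModuleList(
            nn.Sequential(nn.Linear(z_dim, h), nn.ReLU(), nn.Linear(h, x_dim))
            for _ in range(k))
        self.discriminators = nn.ModuleList(
            nn.Sequential(nn.Linear(x_dim, h), nn.ReLU(), nn.Linear(h, 1))
            for _ in range(k))
        self.mix_logits = nn.Parameter(torch.zeros(k))  # trained jointly

    def sample(self, n):
        # Each sample is drawn from one generator, chosen according
        # to the current mixture probabilities.
        probs = F.softmax(self.mix_logits, dim=0)
        idx = torch.multinomial(probs, n, replacement=True).tolist()
        z = torch.randn(n, self.z_dim)
        return torch.stack([self.generators[i](z[j]) for j, i in enumerate(idx)])

Because every pair has its own parameters, memory and per-step compute grow linearly in K, which is the computational expense the statement refers to.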
“…MGAN also adds an extra classification loss term to the generators' loss function to force each generator to specialize on different modes of the training set. Park et al. (2018) pointed out that the mixture weights over the generator distributions in MGAN are unreasonably fixed and evenly distributed, and proposed the Mixture of Experts GAN (MEGAN). Inside MEGAN, a learnable gating network based on the Straight-Through Gumbel-Softmax (Jang et al., 2017) picks one sample from the generated samples of the multiple generators as the output of the model.…”
Section: Related Work
confidence: 99%
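
The straight-through gating mechanism described above can be sketched in a few lines of PyTorch: a gating network maps the noise vector to logits over the generators, F.gumbel_softmax with hard=True returns a one-hot choice that stays discrete in the forward pass but passes soft gradients backward, and the chosen generator's sample becomes the model output. The layer sizes and names below are assumptions for illustration, not MEGAN's actual networks:

import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelGatedGenerators(nn.Module):
    # Illustrative MEGAN-style gate: for each noise vector, select one
    # generator's output via a Straight-Through Gumbel-Softmax sample.
    def __init__(self, k=4, z_dim=64, x_dim=784, h=128):
        super().__init__()
        self.generators = nn.ModuleList(
            nn.Sequential(nn.Linear(z_dim, h), nn.ReLU(), nn.Linear(h, x_dim))
            for _ in range(k))
        self.gate = nn.Linear(z_dim, k)  # gating network over generators

    def forward(self, z, tau=1.0):
        onehot = F.gumbel_softmax(self.gate(z), tau=tau, hard=True)   # (n, k), one-hot rows
        candidates = torch.stack([g(z) for g in self.generators], 1)  # (n, k, x_dim)
        return (onehot.unsqueeze(-1) * candidates).sum(1)             # (n, x_dim)

x = GumbelGatedGenerators()(torch.randn(8, 64))  # 8 samples, each from one selected expert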
“…Albuquerque et al. [1] show that training GAN variants with multiple discriminators is a practical approach, even though it requires extra capacity and computational cost. Employing multiple generators and one discriminator to overcome the mode collapse issue and encourage diverse images has also been proposed [17,11].…”
Section: Introduction
confidence: 99%
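
As a rough sketch of the multiple-discriminator idea in this last statement, the snippet below trains the generator against the average of the non-saturating GAN losses from several independently parameterized discriminators. Averaging is just one possible aggregation, and the networks and sizes here are illustrative assumptions rather than the setup of [1]:

import torch
import torch.nn as nn
import torch.nn.functional as F

def generator_loss_multi_d(fake, discriminators):
    # Non-saturating generator loss averaged over all discriminators:
    # every D scores the same fake batch and G follows the mean loss.
    target = torch.ones(fake.size(0), 1)
    losses = [F.binary_cross_entropy_with_logits(d(fake), target)
              for d in discriminators]
    return torch.stack(losses).mean()

# Three independently initialized discriminators (the extra capacity/cost):
discs = [nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 1))
         for _ in range(3)]
loss = generator_loss_multi_d(torch.randn(8, 784), discs)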