“…To improve the generator distribution, many methods have been proposed; they can be roughly divided into three categories: (1) improving the training approach, such as Unrolled GAN (Metz et al., 2017) or WGAN-GP (Gulrajani et al., 2017), to overcome the problem of mode dropping by stabilizing GAN training; (2) using label conditioning, such as the conditional LAPGAN (Denton et al., 2015), AC-GAN (Odena et al., 2017), and cGAN (Mirza & Osindero, 2014), which consistently and significantly improves sample quality (Goodfellow, 2016; Salimans et al., 2016); (3) ensembling multiple GANs, such as MGAN (Hoang et al., 2017), AdaGAN (Tolstikhin et al., 2017), Mix+GAN (Arora et al., 2017), and MAD-GAN (Ghosh et al., 2018), to cover more modes and improve the fidelity of the generator distribution. However, conditional GANs cannot be trained on unlabeled data sets.…”
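To make category (2) concrete: label conditioning in cGAN-style models typically amounts to feeding the class label to the generator (and discriminator) alongside the usual input, most simply by concatenating a one-hot label vector with the noise vector. A minimal NumPy sketch of building such conditioned generator inputs follows; the function names and dimensions are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def one_hot(labels, num_classes):
    """Encode integer class labels as one-hot row vectors."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

def conditioned_generator_input(z, labels, num_classes):
    """cGAN-style conditioning: concatenate noise with one-hot labels."""
    return np.concatenate([z, one_hot(labels, num_classes)], axis=1)

# Example: a batch of 4 noise vectors of dimension 8, with 10 classes.
z = np.random.randn(4, 8)
labels = np.array([0, 3, 3, 9])
g_in = conditioned_generator_input(z, labels, 10)
print(g_in.shape)  # (4, 18)
```

This also makes the paragraph's final limitation visible: constructing `g_in` requires `labels`, so the scheme cannot be applied to unlabeled data sets.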