2022
DOI: 10.1109/tpami.2021.3099829
AlphaGAN: Fully Differentiable Architecture Search for Generative Adversarial Networks

Abstract: Generative Adversarial Networks (GANs) are formulated as minimax game problems, whereby generators attempt to approach real data distributions by virtue of adversarial learning against discriminators. The intrinsic problem complexity poses the challenge to enhance the performance of generative networks. In this work, we aim to boost model learning from the perspective of network architectures, by incorporating recent progress on automated architecture search into GANs. To this end, we propose a fully different…
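For reference, the minimax game the abstract refers to is the standard GAN objective of Goodfellow et al. (2014), in which the generator G and discriminator D solve:

\[
\min_G \max_D \; \mathbb{E}_{x \sim p_{\mathrm{data}}}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
\]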

Cited by 23 publications (11 citation statements)
References 42 publications
“…Since GAs can tackle difficult problems that cannot be solved directly, many GA-based algorithms for NAS have emerged recently [74]-[77]. Although some NAS methods have been introduced to automatically find network architectures for generative models (e.g., GANs) [41], [43], [45], [46], ENAS-based GANs have received insufficient attention [47]. Ying [42] and Liu [61] first introduced an evolutionary-algorithm-based NAS framework to search GANs stably, but their models address only the unconditional image generation task.…”
Section: Evolutionary Neural Architecture Search
confidence: 99%
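A minimal sketch of the kind of GA-based NAS loop this statement describes, mutating and selecting architecture encodings; the operation set, encoding, fitness placeholder, and hyperparameters are illustrative assumptions, not any cited paper's actual algorithm:

import random

# Illustrative candidate operations for a generator layer (assumed names).
OPS = ["conv3x3", "conv5x5", "skip", "upsample"]

def random_arch(n_layers=5):
    return [random.choice(OPS) for _ in range(n_layers)]

def mutate(arch, rate=0.2):
    # Replace each layer's operation with probability `rate`.
    return [random.choice(OPS) if random.random() < rate else op for op in arch]

def fitness(arch):
    # Placeholder: in practice one would briefly train the GAN defined by
    # `arch` and score it (e.g., negative FID). Dummy score for illustration.
    return -abs(sum(op == "skip" for op in arch) - 1)

population = [random_arch() for _ in range(8)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                       # selection: keep the best half
    children = [mutate(random.choice(parents)) for _ in range(4)]
    population = parents + children                # next generation

print("best architecture:", max(population, key=fitness))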
“…Based on gradient optimization, Gao et al. [44] proposed AdversarialNAS to search the generator and discriminator simultaneously with an adversarial loss function. Meanwhile, Tian et al. [45] introduced a fully differentiable framework for searching the generator and discriminator of a GAN. Since the architectures of the generator and discriminator in AdversarialNAS are deeply coupled, the search complexity and the instability of GAN training are increased.…”
Section: Introduction
confidence: 99%
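A minimal sketch of the DARTS-style continuous relaxation that underlies such fully differentiable search: each layer is a softmax-weighted mixture of candidate operations, so the architecture parameters can be trained by gradient descent alongside the network weights. The candidate set below is an illustrative assumption, not the actual AdversarialNAS or alphaGAN search space:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative candidate operations (assumed, for demonstration only).
CANDIDATES = [
    lambda c: nn.Conv2d(c, c, 3, padding=1),
    lambda c: nn.Conv2d(c, c, 5, padding=2),
    lambda c: nn.Identity(),
]

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([make(channels) for make in CANDIDATES])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # The softmax relaxation makes the discrete choice differentiable.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

if __name__ == "__main__":
    op = MixedOp(channels=8)
    out = op(torch.randn(1, 8, 16, 16))
    print(out.shape)  # torch.Size([1, 8, 16, 16])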
“…Therefore, the developed method was found to be very efficient at finding superior generative models in the proposed large search space, proving its performance and superiority. In a 2021 study, Tian et al. [117] proposed a fully differentiable search framework, known as alphaGAN. The search process was formulated as a bi-level minimax optimization problem.…”
Section: NAS for CV
confidence: 99%
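As a sketch of what such a bi-level minimax formulation typically looks like (the notation here is generic and not necessarily the paper's exact formulation), the outer level optimizes generator and discriminator architecture parameters \(\alpha_G, \alpha_D\) on validation data, while the inner level solves the GAN game over the network weights:

\[
\min_{\alpha_G} \max_{\alpha_D} \; \mathcal{L}_{\mathrm{val}}\big(w_G^{*}, w_D^{*}; \alpha_G, \alpha_D\big)
\quad \text{s.t.} \quad
(w_G^{*}, w_D^{*}) \in \arg \min_{w_G} \max_{w_D} \; \mathcal{L}_{\mathrm{train}}\big(w_G, w_D; \alpha_G, \alpha_D\big)
\]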
“…There are various metrics used to measure the divergence between probability distributions, including Jensen–Shannon (JS) divergence (Goodfellow et al., 2014), Wasserstein distance (Arjovsky et al., 2017), and maximum mean discrepancy (MMD; Huang et al., 2021). This diversity of objective functions has led to several extensions of GANs, such as Wasserstein GAN (WGAN; Arjovsky et al., 2017), LSGAN (Mao et al., 2017), Cramér GAN (Bellemare et al., 2017), f-GAN (Nowozin et al., 2016), AlphaGAN (Y. Tian et al., 2020), and αβ-GAN (Gnanha et al., 2022). On the other hand, different validation metrics are used to judge the quality and diversity of the generated synthetic data.…”
Section: Introduction
confidence: 99%
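For reference, the three divergences named above have standard definitions. With \(M = \tfrac{1}{2}(P + Q)\), \(\Pi(P, Q)\) the set of couplings of \(P\) and \(Q\), and \(\varphi\) a feature map into an RKHS \(\mathcal{H}\):

\[
\mathrm{JS}(P \,\|\, Q) = \tfrac{1}{2}\,\mathrm{KL}(P \,\|\, M) + \tfrac{1}{2}\,\mathrm{KL}(Q \,\|\, M)
\]
\[
W_1(P, Q) = \inf_{\gamma \in \Pi(P, Q)} \mathbb{E}_{(x, y) \sim \gamma}\big[\|x - y\|\big]
\]
\[
\mathrm{MMD}(P, Q) = \big\| \mathbb{E}_{x \sim P}[\varphi(x)] - \mathbb{E}_{y \sim Q}[\varphi(y)] \big\|_{\mathcal{H}}
\]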