2021
DOI: 10.48550/arxiv.2101.11186
Preprint

Evolutionary Generative Adversarial Networks with Crossover Based Knowledge Distillation

Abstract: Generative Adversarial Networks (GANs) are adversarial models that have been demonstrated to be effective for various generative tasks. However, GANs and their variants also suffer from many training problems, such as mode collapse and vanishing gradients. In this paper, we first propose a general crossover operator, which can be widely applied to GANs using evolutionary strategies. We then design an evolutionary GAN framework, C-GAN, based on it, and combine the crossover operator with evolutionary generative a…
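
The abstract describes a crossover operator for evolutionary GANs, and the title suggests it is realized through knowledge distillation. Below is a minimal sketch of how such a distillation-based crossover could look: an offspring generator is trained to imitate whichever of two parent generators the discriminator currently scores higher. The function and variable names (`crossover_distill`, `parent_a`, `parent_b`) and the per-batch parent selection are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a distillation-based crossover between two parent
# generators; not the paper's actual implementation.
import copy
import torch
import torch.nn.functional as F

def crossover_distill(parent_a, parent_b, discriminator, latent_dim,
                      steps=100, batch_size=64, lr=1e-4, device="cpu"):
    """Return an offspring generator distilled from the two parents."""
    offspring = copy.deepcopy(parent_a).to(device)
    opt = torch.optim.Adam(offspring.parameters(), lr=lr)
    for _ in range(steps):
        z = torch.randn(batch_size, latent_dim, device=device)
        with torch.no_grad():
            fake_a, fake_b = parent_a(z), parent_b(z)
            # Choose, per batch, the parent the discriminator rates as more realistic.
            better_a = discriminator(fake_a).mean() > discriminator(fake_b).mean()
            teacher = fake_a if better_a else fake_b
        loss = F.mse_loss(offspring(z), teacher)  # imitation (distillation) loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return offspring
```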

Cited by 2 publications (6 citation statements) | References 20 publications

Citation statements:

“…E-GAN only uses different objective functions for the mutation part. Li et al. (2021) improved the E-GAN algorithm by adding a crossover part, proposing the C-GAN framework that includes the crossover operator.…”
Section: CEGAN
Citation type: mentioning, confidence: 99%
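
The E-GAN mutation mentioned in this snippet trains candidate offspring with different adversarial objectives; the original E-GAN paper uses three (minimax, heuristic, and least-squares). The sketch below, with illustrative names and hyperparameters, shows one way such mutations could be produced; in C-GAN, the crossover operator would then be applied on top of the surviving offspring.

```python
# Hypothetical sketch of E-GAN-style mutation: each offspring is one gradient
# step of the parent generator under a different adversarial objective.
import copy
import torch
import torch.nn.functional as F

def mutate(generator, discriminator, latent_dim, batch_size=64, lr=1e-4, device="cpu"):
    """Return three offspring generators, one per mutation objective."""
    def g_loss(scores, kind):
        if kind == "minimax":    # E[log(1 - D(G(z)))]; log(1 - sigmoid(s)) = -softplus(s)
            return -F.softplus(scores).mean()
        if kind == "heuristic":  # E[-log D(G(z))]; -log sigmoid(s) = softplus(-s)
            return F.softplus(-scores).mean()
        # least-squares: E[(D(G(z)) - 1)^2], treating D's output as a raw score
        return ((scores - 1.0) ** 2).mean()

    offspring = []
    for kind in ("minimax", "heuristic", "least-squares"):
        child = copy.deepcopy(generator).to(device)
        opt = torch.optim.Adam(child.parameters(), lr=lr)
        z = torch.randn(batch_size, latent_dim, device=device)
        loss = g_loss(discriminator(child(z)), kind)
        opt.zero_grad()
        loss.backward()
        opt.step()
        offspring.append(child)
    return offspring
```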
“…It can construct a guided vector (GV) with good direction and adaptive length, and then add the guided vector to the corresponding firework's position to generate an elite solution called a guided spark (GS) (Li et al 2019). In GAO, unlike other EAs that use random sampling to generate elite solutions or guide vectors, the introduction of GANs increases the variability of the algorithm.…”
[Fig. 20: Architecture of GAO]
Section: Generative Adversarial Optimization (GAO) and Its Variation
Citation type: mentioning, confidence: 99%
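
The guided vector described in this snippet follows the guided fireworks algorithm. A common construction, sketched below under the assumptions that lower fitness is better and that a fraction sigma of the sparks is used on each side, takes the difference between the mean of the best sparks and the mean of the worst sparks as the GV; GAO, as the snippet notes, instead lets a GAN generate such guiding directions.

```python
# Illustrative sketch of a guided vector (GV) and guided spark (GS)
# construction; parameter names and the sigma default are assumptions.
import numpy as np

def guided_spark(firework, sparks, fitnesses, sigma=0.2):
    """firework: (d,) position; sparks: (n, d) array; fitnesses: (n,) values
    (lower is better). Returns the guided spark position."""
    n = len(sparks)
    k = max(1, int(sigma * n))
    order = np.argsort(fitnesses)              # best sparks first
    top_mean = sparks[order[:k]].mean(axis=0)
    bottom_mean = sparks[order[-k:]].mean(axis=0)
    gv = top_mean - bottom_mean                # direction and adaptive length
    return firework + gv                       # elite solution (guided spark)
```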
“…Besides KD, several works [14], [16], [19] exploit network pruning. In addition, several works [24], [25] develop evolutionary compression, with inferior results compared with KD-based approaches.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
“…We train the student StyleGAN2 on FFHQ with channel multiplier 1/2 via the objective function Eq. (24) or Eq. (25), with training techniques similar to those used for DGL-GAN, and report the results in Table 7.…”
Section: Table
Citation type: mentioning, confidence: 99%