2020
DOI: 10.1007/978-3-030-63823-8_77
STM-GAN: Sequentially Trained Multiple Generators for Mitigating Mode Collapse

Cited by 7 publications (13 citation statements) · References 4 publications
“…Consequently, it suffices to preserve a limited number of instances from previous flows to prevent bias (i.e., set a threshold for the maximum number of previous benign/attack samples). Furthermore, another practical approach for reproducing previous samples would be using Generative Adversarial Networks (GAN) that can support continuous updating to new data [51,52,53,54].…”
Section: Data Sampling
confidence: 99%
“…These approaches are categorized as follows: a. Building GAN model based on multiple generators or multiple discriminators or both (Bhagyashree et al, 2020;Chavdarova & Fleuret, 2018;Mangalam & Garg, 2021;Mordido et al, 2018;Mu et al, 2022;Varshney et al, 2020;K. Zhang, 2021;Z.…”
Section: Training Instability
confidence: 99%
“…In parallel to the local set pair training, the global discriminator is optimized to detect samples generated by any of the local generators, and similarly, the global generator is optimized to cheat all of the local discriminators. Likewise, Varshney et al (2020) proposed a GAN model using multiple generators and a single discriminator. Each generator is forced to learn a different mode through a proposed loss function.…”
Section: GAN Challenges
confidence: 99%
“…In which the first neural network generates a set of samples and the second attempts to distinguish between different classes of generated and real dataset. [32][33][34][35] These generated features of a massive dataset benefit a wide range of analysis tasks such as link prediction, recommendation, and node classification. 36,37 Existing models 38,39 used a GAN model to efficiently learn the complex joint probability of all the nodes and edges from a synthesized graph.…”
Section: Limitations of Existing GAN Models
confidence: 99%