2020
DOI: 10.1016/j.ins.2019.10.014
Conditional Wasserstein generative adversarial network-gradient penalty-based approach to alleviating imbalanced data classification

Cited by 128 publications (53 citation statements)
References 29 publications
“…As a reference, we used a dedicatedly trained, conditional Wasserstein-GAN (cW-GAN) [12,33]. This network used the same architecture and training routine as for the unconstrained GAN above.…”
Section: Conditional Digit Generation
confidence: 99%
“…Wasserstein CGAN (WCGAN) is often discussed under WC-GAN as in [63] or CWGAN as in [64], [65]. Similar to CGAN discussed in Section IV-C, the generator and discriminator in WCGAN can be conditioned on the failure class labels y, as auxiliary information.…”
Section: E. Wasserstein CGAN (WCGAN)
confidence: 99%
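The statement above describes conditioning the generator and discriminator on class labels y as auxiliary information. A minimal numpy sketch of the standard conditioning mechanism, concatenating a one-hot label encoding onto the noise vector before it enters the generator (all names here are illustrative; a real WCGAN would build this inside a deep-learning framework):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Encode integer class labels as one-hot rows."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

def conditioned_generator_input(noise, labels, num_classes):
    """Build the conditional generator input [z; y] by concatenating
    the noise z with the one-hot class label y along the feature axis."""
    return np.concatenate([noise, one_hot(labels, num_classes)], axis=1)

rng = np.random.default_rng(0)
z = rng.standard_normal((4, 100))   # 4 noise vectors of dimension 100
y = np.array([0, 2, 1, 2])          # failure-class labels
g_in = conditioned_generator_input(z, y, num_classes=3)
print(g_in.shape)  # (4, 103): 100 noise dims + 3 label dims
```

The discriminator (critic) input is conditioned the same way, by concatenating the one-hot label onto the real or generated sample.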
“…Zheng et al [64] have used WCGAN-GP as an oversampling approach to generate realistic minority samples based on learning the real distribution of available minority samples. In comparison to WGAN-GP, the authors show that incorporating the class label in the WCGAN-GP increases the quality of synthetic generative data.…”
Section: E. Wasserstein CGAN (WCGAN)
confidence: 99%
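The oversampling use of WCGAN-GP described above can be sketched as follows: a trained conditional generator synthesizes minority-class samples until the class counts are balanced. This is a hypothetical illustration of the balancing loop only, with a toy stand-in for the generator; the cited approach trains G so that G(z | y = minority) follows the real minority distribution:

```python
import numpy as np

def oversample_minority(X, y, minority_label, generator):
    """Synthesize minority samples with a conditional generator until
    the minority class matches the majority class count."""
    n_majority = int(np.max(np.bincount(y)))
    n_minority = int(np.sum(y == minority_label))
    n_needed = n_majority - n_minority
    if n_needed <= 0:
        return X, y
    # In WCGAN-GP this call would be G(z | y=minority_label); stand-in here.
    synthetic = generator(n_needed, minority_label)
    X_bal = np.vstack([X, synthetic])
    y_bal = np.concatenate([y, np.full(n_needed, minority_label)])
    return X_bal, y_bal

# Toy stand-in generator (illustration only, not a trained model).
def toy_generator(n, label):
    rng = np.random.default_rng(label)
    return rng.standard_normal((n, 2))

X = np.vstack([np.zeros((90, 2)), np.ones((10, 2))])
y = np.concatenate([np.zeros(90, dtype=int), np.ones(10, dtype=int)])
X_bal, y_bal = oversample_minority(X, y, minority_label=1, generator=toy_generator)
print(np.bincount(y_bal))  # [90 90]
```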
“…For instance, inspired by the idea of AC-GAN (Auxiliary Classifier-GAN) [34], Ali-Gombe and Elyan proposed an improved model MFC-GAN (Multiple Fake Class-GAN) [35] and used the MFC-GAN to handle imbalanced data classification problem. Zheng et al [36] proposed a synthetic oversampling approach for imbalanced data sets.…”
Section: Related Work
confidence: 99%