2018
DOI: 10.1016/j.eswa.2017.09.030
Effective data generation for imbalanced learning using conditional generative adversarial networks

Cited by 429 publications (199 citation statements)
References 24 publications
“…Such approaches generally use convolutional neural networks (CNNs) [25][26][27] or conditional generative adversarial networks (cGANs) [28][29][30], where CNNs are based on the application of a single NN, and cGANs are based on two NNs that are trained via an adversarial methodology [31][32][33][34][35].…”
Section: Introduction
confidence: 99%
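The adversarial setup mentioned in the statement above pits two networks against each other, and in a cGAN both networks additionally receive the class label as input. As a minimal numerical sketch (the function name and the use of the non-saturating generator loss are assumptions for illustration, not taken from the cited works):

```python
import numpy as np

def cgan_batch_losses(d_real, d_fake):
    """Per-batch adversarial losses, given the discriminator's
    probability outputs on real samples (d_real) and on generated
    samples (d_fake). In a cGAN, both networks would also be fed
    the class label, e.g. as a one-hot vector concatenated to their
    inputs; that conditioning does not change the loss form."""
    eps = 1e-12  # guard against log(0)
    # discriminator: push real samples towards 1, generated towards 0
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    # generator (non-saturating form): make D call fakes real
    g_loss = -np.mean(np.log(d_fake + eps))
    return d_loss, g_loss
```

At the equilibrium where the discriminator cannot distinguish real from generated samples (all outputs 0.5), the discriminator loss is 2·log 2 ≈ 1.386 and the generator loss is log 2 ≈ 0.693.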
“…Recently, neural network-based oversampling approaches have been used in some works in the literature to generate minority-class instances. In these works, unlike SMOTE-based methods, which do not take the data distribution into account, generative adversarial network (GAN) algorithms are used to learn the class-dependent data distribution and produce a generative model [16]. These techniques are mainly applied in the image-processing area, where several GAN variants have been implemented in order to generate synthetic images [17].…”
Section: Related Work
confidence: 99%
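The idea of learning the class-dependent distribution and sampling from it can be illustrated with a much simpler stand-in: the sketch below fits a multivariate Gaussian to the minority class in place of a trained GAN generator (the function name and model choice are assumptions for illustration only):

```python
import numpy as np

def oversample_from_fitted_model(X_min, n_new, seed=0):
    """Fit a simple generative model to minority-class data X_min
    (a Gaussian here, standing in for a trained GAN generator)
    and draw n_new synthetic minority instances from it."""
    rng = np.random.default_rng(seed)
    mu = X_min.mean(axis=0)             # estimated class mean
    cov = np.cov(X_min, rowvar=False)   # estimated class covariance
    return rng.multivariate_normal(mu, cov, size=n_new)
```

A GAN replaces the Gaussian with a learned, far more flexible distribution, but the oversampling step — draw from the fitted model, append to the minority class — is the same.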
“…Of the advanced over-sampling methods, some typical or recently proposed examples are the SMOTE (Chawla et al 2002), MWMOTE (majority weighted minority over-sampling) (Barua et al 2014), graph-based over-sampling (Pérez-Ortiz et al 2015), diversity-based over-sampling (Bennin et al 2018), and generative adversarial network-based over-sampling (Douzas and Bacao 2018) techniques. Among them, the SMOTE is the most established and widely used over-sampling method, which creates synthetic minority samples by interpolating existing minority samples along the line segments joining the k nearest minority-class neighbours.…”
Section: Resampling Techniques for Class Imbalance
confidence: 99%
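The SMOTE interpolation step described in the statement above can be sketched as follows (a minimal version; the function name and parameter defaults are illustrative, not the original paper's or any library's API):

```python
import numpy as np

def smote(X_min, n_synthetic, k=5, seed=None):
    """Minimal SMOTE sketch: create each synthetic sample by
    interpolating between a minority sample and one of its k
    nearest minority-class neighbours."""
    rng = np.random.default_rng(seed)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude each point itself
    # indices of the k nearest neighbours of every minority sample
    nn = np.argsort(d, axis=1)[:, :k]
    synthetic = np.empty((n_synthetic, X_min.shape[1]))
    for i in range(n_synthetic):
        j = rng.integers(n)                   # pick a minority sample
        neighbour = X_min[rng.choice(nn[j])]  # one of its k neighbours
        gap = rng.random()                    # position on the segment
        synthetic[i] = X_min[j] + gap * (neighbour - X_min[j])
    return synthetic
```

Because every synthetic point lies on a segment between two existing minority samples, the generated data stays inside the convex hull of the minority class — precisely the limitation (no modelling of the underlying distribution) that GAN-based oversampling aims to overcome.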