2021
DOI: 10.1016/j.neucom.2020.01.119
Imbalanced data learning by minority class augmentation using capsule adversarial networks

Cited by 51 publications (50 citation statements)
References 43 publications
“…In addition, the proposed generator could be adopted in other application fields for the modeling of visual information, such as video captioning and action recognition. Finally, there have been several recent research works dealing with the problem of complex and imbalanced data in GAN networks [60, 61, 62, 63, 64]. Although the study of this problem is outside the scope of this paper, we consider that future work in this direction can further improve the accuracy of the proposed network architecture.…”
Section: Discussion
confidence: 99%
“…Here, the ratio of the majority-class sample count to the minority-class sample count is 6.5. Using the SMOTE technique, this ratio was reduced to 1 [9], [10]. The SMOTE method is based on the algorithm given below [32].…”
Section: Methods
confidence: 99%
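As a rough illustration of the SMOTE idea referenced in this statement, each synthetic minority sample is an interpolation between a real minority sample and one of its k nearest minority-class neighbours. The function name, neighbour count, and data layout below are illustrative assumptions, not taken from the cited paper:

```python
import random

def smote(minority, n_new, k=5, seed=0):
    """Generate n_new synthetic samples from `minority` (a list of
    feature vectors) by interpolating between each chosen sample and
    one of its k nearest minority-class neighbours, as in SMOTE."""
    rng = random.Random(seed)

    def dist2(a, b):
        # squared Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x among the other minority samples
        neighbours = sorted((s for s in minority if s is not x),
                            key=lambda s: dist2(x, s))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append([xi + gap * (ni - xi) for xi, ni in zip(x, nb)])
    return synthetic
```

With 65 majority and 10 minority samples (ratio 6.5, as in the statement), generating 55 synthetic minority samples brings the ratio down to 1.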
“…The most widely used class-imbalance measure in the literature, called the imbalance ratio, is calculated as the ratio of the sample count of the largest majority class to that of the smallest minority class. The higher this ratio, the greater the imbalance of the dataset, which causes overfitting during classification and decreases performance [9], [10]. A widely used method to eliminate the imbalance between data classes encountered by deep-learning classifier models is SMOTE (Synthetic Minority Over-sampling Technique).…”
Section: Introduction
confidence: 99%
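The imbalance ratio defined in this statement can be sketched in a few lines; the function name here is an illustrative assumption:

```python
from collections import Counter

def imbalance_ratio(labels):
    """Ratio of the largest class's sample count to the smallest's."""
    counts = Counter(labels)
    return max(counts.values()) / min(counts.values())

# e.g. 65 majority-class labels and 10 minority-class labels
print(imbalance_ratio([0] * 65 + [1] * 10))  # → 6.5
```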
“…Thus, people are directed to augment any data entering any system. Therefore, we provide a method that helps researchers augment data so that they can handle it when it later enters a deep-learning model, mitigating problems such as overfitting and memorization, especially when these datasets are imbalanced [15,16].…”
Section: Small Data and Augmentation
confidence: 99%