2020
DOI: 10.1016/j.patrec.2020.03.004

Adjusting the imbalance ratio by the dimensionality of imbalanced data

Abstract: Class-imbalance extent metrics measure how imbalanced the data are. In pattern classification, it is usually expected that the higher the imbalance extent, the worse the classification performance, and thus an appropriate imbalance extent metric should show a negative correlation with the classification performance. Existing metrics, such as the popular imbalance ratio (IR), only consider the effect of the sample sizes of different classes. However, we note that the dimensionality of imbalanced data also affects …

Cited by 57 publications (21 citation statements)
References 10 publications (12 reference statements)
“…Here, the ratio of the number of majority-class samples to the number of minority-class samples is 6.5. Using the SMOTE technique, this ratio was reduced to 1 [9], [10]. The SMOTE method is based on the algorithm given below [32].…”
Section: Methods
confidence: 99%
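The SMOTE algorithm referred to in the statement above is given in the citing paper and is not reproduced in this report. As a rough illustration of the interpolation idea behind SMOTE, the following minimal NumPy sketch generates synthetic minority samples; the function name and parameters are illustrative, not the citing paper's code.

    import numpy as np

    def smote_oversample(X_min, n_new, k=5, seed=0):
        """Minimal SMOTE-style sketch: interpolate between a minority sample
        and one of its k nearest minority neighbours."""
        rng = np.random.default_rng(seed)
        synthetic = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))                  # pick a minority sample at random
            d = np.linalg.norm(X_min - X_min[i], axis=1)  # distances to the other minority samples
            neighbours = np.argsort(d)[1:k + 1]           # k nearest neighbours (skip the sample itself)
            j = rng.choice(neighbours)
            gap = rng.random()                            # interpolation factor in [0, 1)
            synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
        return np.vstack(synthetic)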
“…The most widely used class-imbalance measure in the literature, called the imbalance ratio, is calculated as the ratio of the sample size of the largest majority class to that of the smallest minority class. The higher this ratio, the greater the imbalance of the dataset, which causes overfitting during classification and decreases performance [9], [10]. A widely used method for eliminating the imbalance between data classes encountered by deep learning classifier models is SMOTE (Synthetic Minority Over-sampling Technique).…”
Section: Introduction
confidence: 99%
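As a usage illustration of the SMOTE oversampling mentioned above, the snippet below applies the imbalanced-learn package (an assumption; the citing papers do not name a specific implementation) to a toy dataset and checks that the class counts become equal, i.e. the imbalance ratio drops to 1.

    from collections import Counter
    import numpy as np
    from imblearn.over_sampling import SMOTE   # assumes the imbalanced-learn package is installed

    rng = np.random.default_rng(0)
    X = rng.normal(size=(130, 4))              # toy feature matrix
    y = np.array([0] * 100 + [1] * 30)         # imbalance ratio = 100 / 30 ≈ 3.3

    X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
    print(Counter(y), Counter(y_res))          # minority class is oversampled until the ratio is 1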
“…Nearly equal numbers of five different arrhythmias, namely N, LBBB, RBBB, PVC, and APB, are selected to balance the dataset and classified to achieve the highest accuracy. Recent studies have revealed that data augmentation techniques and providing an equal number of beat samples across the classes of the dataset can be used to stabilize the imbalance ratio (IR) [37].…”
Section: ECG Database
confidence: 99%
“…The imbalance ratio [18] is defined as R = S_maj / S_min, where S_maj is the number of majority-class samples and S_min is the number of minority-class samples [19]. The greater the value of R, the higher the imbalance.…”
Section: Processing Skewed Dataset
confidence: 99%
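For illustration, a small helper reflecting the quoted definition R = S_maj / S_min (the function name is illustrative), applied to the 6.5:1 case mentioned in the first statement above:

    import numpy as np

    def imbalance_ratio(y):
        """R = S_maj / S_min: size of the largest class over the smallest."""
        counts = np.bincount(np.asarray(y))
        counts = counts[counts > 0]            # ignore label values that do not occur
        return counts.max() / counts.min()

    print(imbalance_ratio([0] * 650 + [1] * 100))   # -> 6.5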