2020
DOI: 10.1109/tnnls.2019.2920246
Hybrid Classifier Ensemble for Imbalanced Data

Cited by 82 publications (20 citation statements)
References 53 publications
“…Several informed over-sampling techniques have also been developed to reduce over-fitting and strengthen class boundaries. Over-sampling methods tend to be more efficient than under-sampling techniques when handling extremely imbalanced big-data problems with large imbalance ratios [12], [23]-[25]. Chawla et al. [11] introduced the Synthetic Minority Over-sampling Technique (SMOTE), a method that creates synthetic minority samples by identifying the feature space of the minority class and considering each sample's k nearest neighbours.…”
Section: A. Data-Driven Methods
confidence: 99%
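To make the interpolation step that SMOTE performs concrete, here is a minimal sketch assuming NumPy and scikit-learn's NearestNeighbors; the function and variable names are illustrative, not taken from the cited papers:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_sample(X_minority, n_synthetic, k=5, random_state=0):
    """Generate synthetic minority samples by interpolating between each
    minority sample and one of its k nearest minority-class neighbours."""
    rng = np.random.default_rng(random_state)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_minority)
    # Drop column 0: each point is its own nearest neighbour.
    neighbour_idx = nn.kneighbors(X_minority, return_distance=False)[:, 1:]

    synthetic = np.empty((n_synthetic, X_minority.shape[1]))
    for i in range(n_synthetic):
        j = rng.integers(len(X_minority))        # pick a minority sample
        nb = neighbour_idx[j, rng.integers(k)]   # pick one of its k neighbours
        gap = rng.random()                       # interpolation factor in [0, 1)
        synthetic[i] = X_minority[j] + gap * (X_minority[nb] - X_minority[j])
    return synthetic
```

Because each synthetic point lies on the segment between two real minority samples, the method widens the minority region rather than duplicating points, which is what reduces the over-fitting that plain random over-sampling suffers from.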
“…Zhu et al. [31] proposed a geometric structural ensemble learning framework, which partitions and eliminates redundant majority samples by generating hyper-spheres under the Euclidean metric and learns base classifiers to enclose the minority samples. Yang et al. [32] proposed a hybrid ensemble classifier framework that combines density-based undersampling and cost-sensitive methods through a multi-objective optimization algorithm, addressing two issues: (1) undersampling methods risk discarding important information; (2) cost-sensitive methods are sensitive to outliers and noise.…”
Section: Related Work
confidence: 99%
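As a rough illustration of what "density-based undersampling" can mean in this setting, the sketch below thins the majority class by keeping the samples that lie in the sparsest regions, with density approximated from k-nearest-neighbour distances. This is one plausible variant under stated assumptions, not the exact procedure of Yang et al. [32]; it assumes NumPy and scikit-learn, and all names are illustrative:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def density_undersample(X_majority, keep_ratio=0.5, k=5):
    """Keep the fraction of majority samples lying in the least dense
    regions; density is approximated by the inverse of the mean distance
    to the k nearest majority-class neighbours."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_majority)
    dist, _ = nn.kneighbors(X_majority)
    # Skip column 0 (distance to itself); a larger mean neighbour
    # distance indicates a lower local density.
    mean_dist = dist[:, 1:].mean(axis=1)
    n_keep = int(keep_ratio * len(X_majority))
    # Keep the samples with the largest neighbour distances, so dense,
    # redundant majority clusters are thinned the most.
    keep_idx = np.argsort(mean_dist)[-n_keep:]
    return X_majority[keep_idx]
```

Preferring sparse-region samples is one way to address issue (1) above: points in dense clusters are largely interchangeable, so removing them loses less information than removing isolated, boundary-defining points.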
“…Machine learning and deep learning algorithms are strongly affected by the class imbalance problem [11]-[15]. The latter refers to the difficulties that arise when one or more classes in a dataset contain far fewer samples than the other class (or classes), producing a marked deterioration in classifier performance [16].…”
Section: Introduction
confidence: 99%
“…Another way to deal with the class imbalance problem is the Cost-Sensitive (CS) approach [34], which has become an important topic in deep learning research in recent years [13]-[15], [35]. CS considers the costs associated with misclassifying samples; i.e., it uses different cost matrices describing the cost of misclassifying any particular data sample [29].…”
Section: Introduction
confidence: 99%
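To illustrate how a cost matrix can enter training, the sketch below converts class-dependent misclassification costs into per-sample weights for a scikit-learn classifier. This is one common realization of the CS idea, not the specific method of [34], and the cost values are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative cost matrix: C[i, j] is the cost of predicting class j
# when the true class is i. Class 1 is the minority here, and missing
# a minority sample is assumed to be 10x as costly as the reverse error.
C = np.array([[0.0,  1.0],
              [10.0, 0.0]])

def cost_sensitive_fit(X, y):
    """Fit a classifier where each sample is weighted by the total cost
    of misclassifying its true class, so minority errors dominate the loss."""
    misclass_cost = C.sum(axis=1)              # [1.0, 10.0]
    sample_weight = misclass_cost[np.asarray(y)]
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y, sample_weight=sample_weight)
    return clf
```

Weighting by cost shifts the decision boundary toward the majority class, trading some majority-class accuracy for fewer of the expensive minority-class misses; as the quoted passage notes, the weakness is that mislabeled or noisy minority points receive the same large weight.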