2020
DOI: 10.1109/tcyb.2018.2877663
Geometric Structural Ensemble Learning for Imbalanced Problems

Cited by 102 publications (30 citation statements)
References 36 publications
“…Wong et al. [30] proposed a cost-sensitive stacked denoising autoencoder ensemble method and applied it to address class-imbalance problems in the business domain. Zhu et al. [31] proposed a geometric structural ensemble learning framework, which partitions and eliminates redundant majority samples by generating hyper-spheres under the Euclidean metric and learns base classifiers to enclose the minority samples. Yang et al. [32] proposed a hybrid ensemble classifier framework that combines density-based undersampling with cost-sensitive methods using a multi-objective optimization algorithm to handle two issues: (1) undersampling methods suffer from losing important information; (2) cost-sensitive methods are sensitive to outliers and noise.…”
Section: Related Work (mentioning)
confidence: 99%
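The description of Zhu et al. [31] quoted above can be made concrete with a minimal sketch of the underlying idea: Euclidean hyper-spheres mark redundant majority samples, and base classifiers are then trained on the reduced data together with the full minority class. The function names and the covering heuristic below are illustrative assumptions for this report only, not the authors' actual GSE algorithm.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def sphere_reduce(X_maj, X_min, rng):
    # Cover the majority class with Euclidean hyper-spheres whose radius
    # stops at the nearest minority sample; keep one representative
    # (the centre) per sphere and drop the majority samples it encloses.
    remaining = np.arange(len(X_maj))
    kept = []
    while remaining.size:
        centre_idx = rng.choice(remaining)
        centre = X_maj[centre_idx]
        radius = np.min(np.linalg.norm(X_min - centre, axis=1))
        inside = np.linalg.norm(X_maj[remaining] - centre, axis=1) <= radius
        kept.append(centre_idx)
        remaining = remaining[~inside]
    return X_maj[kept]

def train_base_classifiers(X_maj, X_min, n_members=5, seed=0):
    # Each base classifier sees the full minority class and a different
    # sphere-reduced view of the majority class.
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        X_red = sphere_reduce(X_maj, X_min, rng)
        X = np.vstack([X_red, X_min])
        y = np.hstack([np.zeros(len(X_red)), np.ones(len(X_min))])
        members.append(DecisionTreeClassifier().fit(X, y))
    return members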
“…In summary, in recent years, in addition to traditional machine learning and deep learning methods, classification methods based on ensemble learning [25,26] have gradually attracted the attention of many researchers and achieved good results in emotion recognition from EEG signals. However, some problems still exist in this kind of method: the features extracted from EEG signals are not typical enough to reflect emotional information well, and the performance of the models still needs to be improved.…”
Section: Introduction (mentioning)
confidence: 99%
“…The third is the hybrid approach, which combines sampling methods with algorithm-level approaches to form an ensemble system or a neural network [4]. Ensemble methods [14–16] improve classification performance by training several different classifiers and combining their predictions to output a single class label. However, since overall accuracy is not an appropriate measure for imbalanced data sets [17], an ensemble method alone is not enough to deal with imbalanced situations.…”
Section: Introduction (mentioning)
confidence: 99%
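As a generic illustration of the point made in that excerpt, and not a method from any of the cited papers, the sketch below trains three different classifiers on a synthetic imbalanced data set, combines their predictions by hard voting into a single class label, and contrasts overall accuracy with balanced accuracy. The data set and parameter choices are assumptions for demonstration only.

from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, balanced_accuracy_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with roughly 5% positives to mimic class imbalance.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)

# Hard voting: each member predicts a label, the majority vote is the output.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(max_depth=5)),
    ],
    voting="hard",
).fit(X, y)

pred = ensemble.predict(X)
# Overall accuracy can look high even if the minority class is poorly served;
# balanced accuracy averages per-class recall and exposes that gap.
print("accuracy:", accuracy_score(y, pred))
print("balanced accuracy:", balanced_accuracy_score(y, pred))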