2018
DOI: 10.1007/s11704-016-5306-z
Evolutionary under-sampling based bagging ensemble method for imbalanced data classification

Cited by 67 publications (31 citation statements)
References 43 publications
“…Ensemble learning is one of the most popular methods at present. It can achieve near-optimal classification performance on many problems, and better generalization than a single classifier, by training multiple individual classifiers and combining them [8, 20–32]. There are two main approaches to ensemble learning: Bagging and Boosting.…”
Section: Related Work
confidence: 99%
“…Minaei-Bidgoli [23] proposed an ensemble-based approach to feature selection in order to overcome the parameter sensitivity of feature-selection methods. In many cases it is better to combine resampling with ensemble learning [26–32]. Kang et al. [26] proposed EUS (ensemble undersampling), which selects as many samples from the majority class as there are in the minority class to form several balanced subsets, then trains an SVM-based classifier on each subset, mitigating the information loss of undersampling to a certain extent.…”
Section: Related Work
confidence: 99%
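The EUS idea quoted above — undersample the majority class down to the minority-class size several times, train one classifier per balanced subset, and vote — can be sketched in a few lines. This is a minimal illustration under assumed names (`balanced_subsets`, `eus_ensemble`), not the authors' implementation; a toy nearest-mean classifier on 1-D features stands in for the SVMs used in the cited work.

```python
import random
from collections import Counter

def balanced_subsets(majority, minority, n_subsets, seed=0):
    """Per subset, draw |minority| majority-class samples (without
    replacement) and pair them with the full minority class, so every
    subset is perfectly balanced."""
    rng = random.Random(seed)
    return [rng.sample(majority, len(minority)) + list(minority)
            for _ in range(n_subsets)]

class NearestMean:
    """Stand-in base learner on 1-D features: assign x to the class
    whose training mean is closer (the cited work uses SVMs)."""
    def fit(self, xs, ys):
        self.m0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
        self.m1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
        return self
    def predict(self, x):
        return 1 if abs(x - self.m1) < abs(x - self.m0) else 0

def eus_ensemble(majority_xs, minority_xs, n_subsets=5, seed=0):
    """Train one base learner per balanced subset (majority = class 0,
    minority = class 1); predict by majority vote over the ensemble."""
    models = []
    for subset in balanced_subsets(majority_xs, minority_xs, n_subsets, seed):
        ys = [0] * len(minority_xs) + [1] * len(minority_xs)
        models.append(NearestMean().fit(subset, ys))
    return lambda x: Counter(m.predict(x) for m in models).most_common(1)[0][0]
```

Because each subset is balanced, no single base learner is dominated by the majority class, while the ensemble as a whole still sees most of the majority data across its subsets.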
“…Leo Breiman proposed the Bagging algorithm in 1996 [20], and bias–variance theory gives a complete and reasonable explanation of its effectiveness [26], [27]. The basic idea of the Bagging algorithm is as follows: given a weak classifier and a training set, the weak classification algorithm is used to classify the training samples.…”
Section: Classification Prediction By BP Neural Network Ensemble M
confidence: 99%
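The bootstrap-and-vote procedure Breiman's Bagging refers to can be sketched as follows. This is an illustrative toy under assumed names (1-D data, a threshold "stump" as the weak learner, an assumption that class 1 lies above class 0), not code from the cited papers.

```python
import random
from collections import Counter

class Stump:
    """Toy weak learner for 1-D, two-class data: thresholds at the
    midpoint between the two class means (assumes class 1 lies higher)."""
    def fit(self, data):                      # data: list of (x, label)
        m0 = [x for x, y in data if y == 0]
        m1 = [x for x, y in data if y == 1]
        self.t = (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2
        return self
    def predict(self, x):
        return 1 if x > self.t else 0

def _bootstrap(data, rng):
    """Resample n points with replacement; redraw if a class is missing,
    so the stump always sees both classes."""
    while True:
        sample = [rng.choice(data) for _ in data]
        if len({y for _, y in sample}) == 2:
            return sample

def bagging(data, n_estimators=11, seed=0):
    """Train each weak learner on its own bootstrap resample of the
    training set, then combine predictions by majority vote."""
    rng = random.Random(seed)
    models = [Stump().fit(_bootstrap(data, rng)) for _ in range(n_estimators)]
    return lambda x: Counter(m.predict(x) for m in models).most_common(1)[0][0]
```

The variance-reduction argument mentioned in the quote shows up directly here: each stump's threshold depends on its own noisy resample, but averaging many such thresholds through the vote makes the combined prediction far more stable than any single stump.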
“…Additionally, the researchers note that the k-nearest-neighbor-based bagging algorithm gave the best results in cases involving big unbalanced credit scoring datasets [56]. Other applications of bagging include network intrusion detection to increase true positives and reduce false negatives [50], credit card fraud detection [51], medical diagnosis of arrhythmia beats [52], urban traffic flow forecasting [53], forecasting of wind and solar power [54], and imbalanced data classification using evolutionary under-sampling bootstrap aggregation models [55].…”
Section: How Bagging Works
confidence: 99%