2020
DOI: 10.1016/j.eswa.2020.113660

An ensemble imbalanced classification method based on model dynamic selection driven by data partition hybrid sampling

Cited by 50 publications (23 citation statements)
References 39 publications

“…With the extensive application of ensemble approaches, designing more efficient ensemble classification algorithms has become an important issue. Compared with static ensemble algorithms, dynamic selection ensemble algorithms [71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83] have been shown to effectively improve F-measure and G-mean values. A dynamic selection ensemble algorithm predicts the label of a test sample by evaluating the competence of each classifier and selecting the set of the most capable or competitive classifiers.…”
Section: Ensemble Approaches for Imbalanced Classification
Citation type: mentioning
Confidence: 99%
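
The dynamic selection idea described in this statement can be sketched in a few lines: for each test sample, estimate every base classifier's competence on the sample's nearest validation neighbors and let only the most competent classifiers vote. The snippet below is a minimal illustration using scikit-learn, assuming a held-out validation set; the function name and the k and n_keep parameters are our own choices, not details from the cited paper.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def dynamic_select_predict(classifiers, X_val, y_val, x, k=7, n_keep=3):
    """Predict one sample by keeping the locally most accurate classifiers.

    The competence of each base classifier is its accuracy on the k
    validation samples closest to x (its "region of competence").
    """
    X_val, y_val = np.asarray(X_val), np.asarray(y_val)
    nn = NearestNeighbors(n_neighbors=k).fit(X_val)
    idx = nn.kneighbors(np.atleast_2d(x), return_distance=False)[0]
    X_loc, y_loc = X_val[idx], y_val[idx]

    # Local accuracy of every base classifier around x
    competence = np.array([clf.score(X_loc, y_loc) for clf in classifiers])

    # Keep the n_keep most competent classifiers and majority-vote their outputs
    keep = np.argsort(competence)[-n_keep:]
    votes = [classifiers[i].predict(np.atleast_2d(x))[0] for i in keep]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]
```

In practice, the choice of the region of competence and of the competence measure (local accuracy, ranking, oracle behavior) is what distinguishes the many dynamic selection variants cited above.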
“…According to the proportion of minority samples in the neighborhood of each minority sample, the data space is divided into five regions [12]: the boundary minority samples region, the noise minority samples region, the safe minority samples region, the boundary majority samples region, and the safe majority samples region, as shown in Figure 2.…”
Section: Data Partition
Citation type: mentioning
Confidence: 99%
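
A rough way to reproduce such a five-region partition is to compute, for every sample, the fraction of minority-class points among its k nearest neighbors and bin that fraction. The sketch below is only an interpretation under assumed thresholds and region names; the exact rules and neighborhood size of the cited method are not given in this excerpt.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def partition_regions(X, y, minority_label, k=5):
    """Assign each sample to a region based on the minority proportion
    among its k nearest neighbours (thresholds here are illustrative)."""
    X, y = np.asarray(X), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    idx = nn.kneighbors(X, return_distance=False)[:, 1:]   # drop the self-neighbour
    minority_ratio = (y[idx] == minority_label).mean(axis=1)

    regions = []
    for label, ratio in zip(y, minority_ratio):
        if label == minority_label:
            if ratio == 0.0:
                regions.append("noise minority")      # surrounded only by majority
            elif ratio < 0.5:
                regions.append("boundary minority")   # mixed neighbourhood
            else:
                regions.append("safe minority")       # mostly minority neighbours
        elif ratio >= 0.5:
            regions.append("boundary majority")       # majority point near minority area
        else:
            regions.append("safe majority")
    return np.array(regions)
```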
“…When tackling the above-mentioned datasets, traditional classification models usually perform better on the majority class than on the minority class, even though the minority class is usually the more important one. To improve classification performance on the minority class of imbalanced datasets, many imbalance learning algorithms have been proposed [3][4][5][6].…”
Section: IR = #majority / #minority
Citation type: mentioning
Confidence: 99%
“…In addition, imbalance learning was listed as one of the top ten difficult problems in the field of data mining at ICDM'05 [7]. With the development of big data and deep learning techniques, the imbalance learning field will face many more challenges [6,8], e.g., imbalance learning algorithms for big data processing platforms, new approaches for synthetic minority sampling, and different trade-off learning strategies for high-IR datasets [8]. The existing imbalance learning algorithms can be divided into three groups [9,10].…”
Section: IR = #majority / #minority
Citation type: mentioning
Confidence: 99%
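
For completeness, the imbalance ratio used in the section title above and the G-mean mentioned in the first statement can be computed as follows; this is a small hedged helper sketch, not code from any of the cited works.

```python
import numpy as np
from sklearn.metrics import recall_score

def imbalance_ratio(y, minority_label):
    """IR = number of majority samples / number of minority samples."""
    y = np.asarray(y)
    n_minority = np.sum(y == minority_label)
    return (len(y) - n_minority) / n_minority

def g_mean(y_true, y_pred):
    """Geometric mean of the per-class recalls."""
    recalls = recall_score(y_true, y_pred, average=None)
    return float(np.prod(recalls) ** (1.0 / len(recalls)))
```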