2018
DOI: 10.1016/j.neucom.2018.01.060
A study on combining dynamic selection and data preprocessing for imbalance learning

Cited by 98 publications (56 citation statements)
References 47 publications
“…Thus, making it easier to replicate the results of this paper. The process of creating the dynamic selection dataset (DSEL) was guided by the experiments conducted in [22].…”
Section: Discussion
confidence: 99%
“…This could be explained by the fact that Random Balance was proposed to deal specifically with small-sized and imbalanced data [42], which comprises the 64 datasets in this study. Moreover, this technique achieved state-of-the-art performance for such datasets in several comparative studies [29,22]. Hence, the FKNE++ is competitive with the state-of-the-art methods for dealing with small-sized and imbalanced datasets.…”
Section: Comparison Among Different Scenarios
confidence: 95%
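The excerpt above refers to Random Balance, which trains each ensemble member on a resampled set with a randomly drawn class ratio. Below is a minimal sketch of one such resampling step; the published method uses SMOTE for the oversampling side, while this dependency-free sketch substitutes plain random oversampling, and all names and bounds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_balance(X, y):
    """One Random Balance resample for a binary problem: draw a random
    class ratio, then over-/undersample each class to match it.
    NOTE: the original method uses SMOTE to oversample; plain random
    oversampling is used here as a simplification."""
    n = len(y)
    idx0, idx1 = np.flatnonzero(y == 0), np.flatnonzero(y == 1)
    # random size for class 1, chosen so both classes keep >= 2 samples
    n1 = int(rng.integers(2, n - 1))
    n0 = n - n1
    # sample with replacement only when the class must grow
    pick = lambda idx, m: rng.choice(idx, size=m, replace=m > len(idx))
    sel = np.concatenate([pick(idx0, n0), pick(idx1, n1)])
    return X[sel], y[sel]
```

Each base classifier in the ensemble would call `random_balance` independently, so members see different class ratios and gain diversity.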
“…The use of sampling techniques in conjunction with ensemble learning is usually more straightforward when combined with bagging than with boosting, because the bagging method does not require computing weights or adapting the weight-update formulas during the learning process [52]. Many studies integrating random undersampling into bagging may be explored in [53].…”
Section: Overview Of the Methods
confidence: 99%
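The point in the excerpt — that undersampling slots into bagging without any weight bookkeeping — can be illustrated with a minimal sketch of the common "undersampling inside bagging" pattern. The nearest-centroid base learner and all function names are illustrative assumptions, not the methods of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def undersample(X, y):
    """Randomly undersample every class down to the minority-class size."""
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    return X[idx], y[idx]

def fit_centroids(X, y):
    """Trivial nearest-centroid base learner (illustrative stand-in)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroids(model, x):
    return min(model, key=lambda c: np.linalg.norm(x - model[c]))

def under_bagging(X, y, n_estimators=5):
    """Each round: bootstrap, balance by undersampling, fit a base learner.
    No instance weights are computed or updated, unlike boosting."""
    models, n = [], len(y)
    for _ in range(n_estimators):
        boot = rng.integers(0, n, size=n)       # bootstrap indices
        Xb, yb = undersample(X[boot], y[boot])  # balance the sample
        models.append(fit_centroids(Xb, yb))
    return models

def predict_ensemble(models, x):
    votes = [predict_centroids(m, x) for m in models]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[counts.argmax()]
```

The only change relative to plain bagging is the `undersample` call on each bootstrap sample, which is exactly why the combination is considered straightforward.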
“…As shown in [6], a diverse ensemble can better cope with an imbalanced distribution. In particular, dynamic selection (DS) techniques are seen as an alternative for dealing with multi-class imbalance, as they explore the local competence of each base classifier according to each new query sample [7,2,8]. Only the base classifiers that attained a certain competence level, in the given local region, are selected to predict the label of the query sample.…”
Section: Introduction
confidence: 99%
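The selection mechanism described in the excerpt — estimating each base classifier's competence in the local region of the query and keeping only the competent ones — can be sketched with a local-accuracy criterion in the spirit of OLA. The `k`, `threshold`, and function names are illustrative assumptions, not the exact methods of the cited papers.

```python
import numpy as np

def local_accuracy(model_predict, X_dsel, y_dsel, x_query, k=7):
    """Competence = accuracy of one base classifier over the k nearest
    neighbors of the query in the dynamic selection dataset (DSEL)."""
    d = np.linalg.norm(X_dsel - x_query, axis=1)
    nn = np.argsort(d)[:k]
    return np.mean([model_predict(X_dsel[i]) == y_dsel[i] for i in nn])

def dynamic_select(models, X_dsel, y_dsel, x_query, threshold=0.6):
    """Keep only base classifiers whose local competence reaches the
    threshold; combine the selected ones by majority vote. Falls back
    to the whole pool if no classifier qualifies."""
    comp = [local_accuracy(m, X_dsel, y_dsel, x_query) for m in models]
    selected = [m for m, c in zip(models, comp) if c >= threshold] or models
    votes = [m(x_query) for m in selected]
    vals, counts = np.unique(votes, return_counts=True)
    return vals[counts.argmax()]
```

Because the competent subset is recomputed per query, a classifier that is weak globally can still be chosen in regions where it happens to be accurate — which is why DS is attractive under class imbalance.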