2015
DOI: 10.1016/j.ins.2015.07.025
Diversity techniques improve the performance of the best imbalance learning ensembles

Cited by 174 publications (101 citation statements)
References 39 publications
“…3 -5. There are some differences in positioning of different algorithms by various measures. This fact was also pointed out by T. Raeder et al. in [55], J. J. Rodriguez et al. in [56] and J. F. Díez-Pastor et al. in [12]. In our tests the highest average ranking position has the RandomForest (RF) classifier regardless of adopted measure.…”
Section: Experiments #2 (supporting)
confidence: 85%
“…S1). However, in the SMOTE algorithm, the same number of synthetic data points is generated for each minority sample without considering neighboring samples, which may increase the overlap between classes [27][31][32]. More details about the SMOTE algorithm can be found in ref.…”
Section: Theory and Methods (mentioning)
confidence: 99%
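The limitation quoted above is easiest to see in the interpolation step of SMOTE itself: every minority sample spawns the same fixed number of synthetic points along lines toward its nearest minority neighbors, with no check on whether those lines cross into majority territory. The following is a minimal sketch of that procedure (plain Python, not the cited authors' implementation); the function name `smote` and parameters `n_new_per_sample` and `k` are illustrative choices, not names from the paper.

```python
import random
import math

def smote(minority, n_new_per_sample, k=3, seed=0):
    """Minimal SMOTE sketch: for each minority sample, generate
    n_new_per_sample synthetic points by interpolating toward a
    randomly chosen one of its k nearest minority-class neighbors.
    Note it never inspects majority samples, which is why synthetic
    points can land in overlap regions between the classes."""
    rng = random.Random(seed)
    synthetic = []
    for i, x in enumerate(minority):
        # Euclidean distances to every other minority sample.
        dists = sorted(
            (math.dist(x, y), j) for j, y in enumerate(minority) if j != i
        )
        neighbors = [minority[j] for _, j in dists[:k]]
        for _ in range(n_new_per_sample):
            nb = rng.choice(neighbors)
            u = rng.random()  # interpolation factor in [0, 1]
            synthetic.append([a + u * (b - a) for a, b in zip(x, nb)])
    return synthetic

# Four minority points at the corners of the unit square,
# two synthetic points per original sample -> eight new points.
minority = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
new_points = smote(minority, n_new_per_sample=2)
print(len(new_points))  # 8
```

Because each synthetic point is a convex combination of two existing minority points, every generated coordinate stays inside the hull of the minority class, regardless of where majority samples sit.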
“…The current work is strongly motivated by previously published approaches [29,49] for ensemble construction. Diversity and quality are two important issues for the performance of ensemble models.…”
Section: Adaptive Ensemble Classification Framework (AECF) (mentioning)
confidence: 99%
“…However, the benets of these preprocess techniques may vary in characteristics of datasets. 20,29 Furthermore, some potential useful data may be omitted when modeling on a very small balanced subset from the original data by some methods such as the undersampling based approaches. 30 The algorithm level methods reduce the sensitiveness to class unbalance by modications of existing classication algorithms.…”
Section: Introductionmentioning
confidence: 99%
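The data-loss concern raised in this excerpt can be made concrete with a sketch of random undersampling: to balance the classes, majority-class rows are simply discarded, so the classifier never sees them. This is a generic illustration in plain Python, not code from any of the cited papers; the function name `random_undersample` is a hypothetical label.

```python
import random

def random_undersample(X, y, seed=0):
    """Balance a binary dataset (labels 0/1) by randomly discarding
    majority-class samples until both classes have equal size.
    The discarded rows -- potentially useful data -- are lost."""
    rng = random.Random(seed)
    pos = [i for i, label in enumerate(y) if label == 1]
    neg = [i for i, label in enumerate(y) if label == 0]
    minority, majority = (pos, neg) if len(pos) <= len(neg) else (neg, pos)
    kept_majority = rng.sample(majority, len(minority))
    keep = sorted(minority + kept_majority)
    return [X[i] for i in keep], [y[i] for i in keep]

# Ten samples, 2 positives vs 8 negatives: six negatives are thrown away.
X = [[float(i)] for i in range(10)]
y = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
Xb, yb = random_undersample(X, y)
print(len(Xb), sum(yb))  # 4 2
```

On severely imbalanced data, the balanced subset can be a tiny fraction of the original, which is exactly the situation the excerpt flags for undersampling-based approaches.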