Proceedings of the 23rd International Conference on Machine Learning - ICML '06 2006
DOI: 10.1145/1143844.1143921
Pruning in ordered bagging ensembles

Abstract: This is an author-produced version of the published conference paper. We present a novel ensemble pruning method based on reordering the classifiers obtained from bagging and then selecting a subset for aggregation. Ordering the classifiers generated in bagging makes it possible to build subensembles of increasing size by including first those classifiers that are expected to perform best when aggregated. Ensemble pruning is achieved by ha…
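The reordering idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' exact ordering heuristic; the majority-vote rule, the validation-error criterion, and the function names (`vote`, `ensemble_error`, `order_classifiers`, `prune`) are assumptions made for this illustration:

```python
# Sketch of ordered-bagging pruning: greedily reorder the classifiers
# produced by bagging so each step appends the one that most reduces the
# subensemble's validation error, then keep a prefix of the ordering.

def vote(classifiers, x):
    """Majority vote of binary {-1, +1} classifiers on one input."""
    s = sum(c(x) for c in classifiers)
    return 1 if s >= 0 else -1

def ensemble_error(classifiers, data):
    """Fraction of (x, y) validation pairs misclassified by the vote."""
    return sum(vote(classifiers, x) != y for x, y in data) / len(data)

def order_classifiers(classifiers, data):
    """Greedy reordering: repeatedly append the classifier whose
    inclusion gives the lowest subensemble error on validation data."""
    remaining = list(classifiers)
    ordered = []
    while remaining:
        best = min(remaining,
                   key=lambda c: ensemble_error(ordered + [c], data))
        ordered.append(best)
        remaining.remove(best)
    return ordered

def prune(classifiers, data, size):
    """Subensembles of increasing size are prefixes of the ordering;
    pruning keeps only the first `size` classifiers."""
    return order_classifiers(classifiers, data)[:size]
```

Because the ordering front-loads the classifiers expected to aggregate best, a short prefix can match or beat the full ensemble while being much cheaper to evaluate.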


Cited by 130 publications (86 citation statements). References 19 publications.
“…After obtaining the component learners, most ensemble algorithms combine all of them to build an ensemble; however, it has been shown that it is better to ensemble some instead of all of them [24,23,17].…”
Section: Related Work (mentioning)
Confidence: 99%
“…Martínez-Muñoz and Suárez [17] proposed a heuristic method in which the component learners obtained from bagging are reordered, and a portion of the top-ranked ones are included in the final ensemble. The experimental results also show that selective ensembling can improve an ensemble's performance while reducing its size.…”
Section: Related Work (mentioning)
Confidence: 99%
“…However, ensemble pruning is a difficult problem whose exact solution is commonly computationally expensive. Pruning an ensemble with n models requires searching the space of the 2^n − 1 non-empty subensembles to minimize a cost function correlated with the generalization error [18].…”
Section: Introduction (mentioning)
Confidence: 99%
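The 2^n − 1 figure in the statement above is why exact pruning is only feasible for tiny ensembles. A brute-force sketch makes the cost concrete (the helper names and the toy cost function are assumptions for this illustration; `error` stands in for any cost correlated with generalization error):

```python
# Exact ensemble pruning by exhaustive search: score every one of the
# 2**n - 1 non-empty subsets of an n-model ensemble and keep the best.
# Tractable only for very small n.
from itertools import combinations

def num_subensembles(n):
    """Number of non-empty subsets of an n-model ensemble."""
    return 2 ** n - 1

def best_subensemble(classifiers, error):
    """Brute-force search: evaluate `error` (a cost function, e.g.
    validation error) on every non-empty subset, return the cheapest."""
    best, best_cost = None, float("inf")
    for k in range(1, len(classifiers) + 1):
        for subset in combinations(classifiers, k):
            cost = error(subset)
            if cost < best_cost:
                best, best_cost = subset, cost
    return best, best_cost
```

For a typical bagging ensemble of 200 models, `num_subensembles(200)` is on the order of 10^60 candidate subsets, which is what motivates heuristics such as ordered aggregation instead of exact search.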