2009
DOI: 10.1109/tpami.2008.204
Statistical Instance-Based Pruning in Ensembles of Independent Classifiers

Abstract: The global prediction of a homogeneous ensemble of classifiers generated in independent applications of a randomized learning algorithm on a fixed training set is analyzed within a Bayesian framework. Assuming that majority voting is used, it is possible to estimate with a given confidence level the prediction of the complete ensemble by querying only a subset of classifiers. For a particular instance that needs to be classified, the polling of ensemble classifiers can be halted when the probability that the p…

Cited by 48 publications (65 citation statements)
References 13 publications
“…, l}, and T_{k*_t} > T_j for j ≠ k*_t, where k*_t is the majority class after querying the first t classifiers. See [6] for further details.…”
Section: Statistical Instance-Based Ensemble Pruning
confidence: 99%
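The halting rule quoted above (stop polling once the class k*_t leading after t queries cannot lose its majority) can be illustrated with a minimal sketch. All names here are hypothetical; this shows the conservative exact-stopping variant, whereas the paper's statistical test halts even earlier at a chosen confidence level:

```python
def prune_vote(classifiers, x):
    """Query ensemble members one at a time and stop as soon as the
    current majority class can no longer be overtaken by the votes
    that remain unqueried."""
    T = len(classifiers)
    counts = {}
    for t, clf in enumerate(classifiers, start=1):
        label = clf(x)
        counts[label] = counts.get(label, 0) + 1
        # Current leader and strongest competitor.
        leader = max(counts, key=counts.get)
        runner_up = max((v for k, v in counts.items() if k != leader),
                        default=0)
        remaining = T - t
        if counts[leader] - runner_up > remaining:
            # Even if every remaining classifier voted for the
            # runner-up, the leader would still win: halt polling.
            return leader, t
    return leader, T
```

For example, if all five members of an ensemble agree, polling stops after the third vote, since the two unqueried classifiers can no longer change the outcome.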
“…Therefore, for a given T there are only (T + 1)T/2 different outcomes for (6) and T + 1 for (5). This is a fairly small number of values that can be easily precomputed and stored in an auxiliary table.…”
Section: Optimizations
confidence: 99%
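The optimization quoted above tabulates the halting test once per ensemble size, so prediction-time queries reduce to a table lookup. A hedged sketch of how such a table might be built, assuming for illustration a binary problem and a uniform Beta(1,1) prior over the vote probability (the paper's actual model and test may differ):

```python
import math

def log_beta(a, b):
    """log of the Beta function via log-gamma."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def p_leader_wins(T, t, v):
    """Posterior probability (Beta(1,1) prior, two classes) that the
    class currently holding v of t observed votes still has a strict
    majority once all T classifiers have been queried."""
    r = T - t                       # classifiers not yet queried
    need = max(0, T // 2 + 1 - v)   # extra leader votes still required
    if need > r:
        return 0.0
    lb = log_beta(v + 1, t - v + 1)
    total = 0.0
    for k in range(need, r + 1):    # beta-binomial tail probability
        total += math.exp(math.log(math.comb(r, k))
                          + log_beta(v + 1 + k, t - v + 1 + r - k) - lb)
    return total

def build_halt_table(T, alpha=0.99):
    """Precompute, for every (t, v) pair, whether polling may stop at
    confidence alpha -- an O(T^2)-sized lookup of the kind the quoted
    passage describes."""
    return {(t, v): p_leader_wins(T, t, v) >= alpha
            for t in range(1, T + 1) for v in range(t + 1)}
```

At prediction time the ensemble then only consults `table[(t, v)]` after each vote, so the statistical test itself costs nothing per instance.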
“…However, larger ensembles have larger storage needs and longer prediction times. To alleviate these shortcomings, different ensemble pruning methods can be used [4][5][6][7][8][9][10][11][12][13]. The goal of these methods is to reduce the memory requirements and to speed up the classification process while maintaining or, if possible, improving the accuracy of the original ensemble.…”
Section: Introduction
confidence: 99%