2019
DOI: 10.1007/s10618-019-00638-y

A probabilistic classifier ensemble weighting scheme based on cross-validated accuracy estimates

Abstract: Our hypothesis is that building ensembles of small sets of strong classifiers constructed with different learning algorithms is, on average, the best approach to classification for real-world problems. We propose a simple mechanism for building small heterogeneous ensembles based on exponentially weighting the probability estimates of the base classifiers with an estimate of the accuracy formed through cross-validation on the train data. We demonstrate through extensive experimentation that, given the same sma…
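The mechanism the abstract describes, weighting each base classifier's probability estimates by a power of its cross-validated training accuracy, can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes scikit-learn-style estimators, and the exponent default and function names are our own.

```python
import numpy as np
from sklearn.model_selection import cross_val_score

def fit_weighted_ensemble(estimators, X_train, y_train, alpha=4, cv=10):
    """Estimate each base classifier's accuracy by cross-validation on the
    training data, then use accuracy**alpha as its weight (a sketch of the
    exponential weighting scheme described in the abstract; the exponent is
    a tunable constant and 4 is our assumed default)."""
    weights = []
    for est in estimators:
        acc = cross_val_score(est, X_train, y_train, cv=cv).mean()
        weights.append(acc ** alpha)
        est.fit(X_train, y_train)  # refit on the full training set
    return np.array(weights)

def predict_proba_weighted(estimators, weights, X):
    """Combine the base classifiers' probability estimates as a
    weight-normalised average."""
    probs = np.array([est.predict_proba(X) for est in estimators])
    combined = np.einsum('e,enc->nc', weights, probs)  # weighted sum over estimators
    return combined / weights.sum()
```

Raising the accuracy estimates to a power greater than one amplifies small differences between base classifiers, so stronger members dominate the combined probability estimate.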


Cited by 76 publications (55 citation statements)
References 25 publications
“…The predictions are combined using a stacking scheme that includes category assignment by highest aggregate voting, votes weighted by prediction probability, and votes weighted by algorithm performance. The predictions of the set of individual algorithms are thus combined to produce a single class-membership prediction for each student response and rubric bin (Large et al. 2019).…”
Section: Machine Learning Model Development
Citation type: mentioning (confidence: 99%)
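The three combination rules listed in this statement, majority voting, probability-weighted voting, and performance-weighted voting, can be illustrated for a single test instance. A sketch under assumed inputs, with array names of our choosing:

```python
import numpy as np

def combine_votes(probs, accs, n_classes):
    """Illustrates three combination rules for one test instance:
    probs is an (estimators, classes) array of predicted class
    probabilities; accs holds each estimator's accuracy estimate."""
    preds = probs.argmax(axis=1)  # each estimator's hard vote

    # 1. Highest aggregate (majority) voting.
    majority = np.bincount(preds, minlength=n_classes).argmax()

    # 2. Votes weighted by prediction probability: sum the probabilities.
    prob_weighted = probs.sum(axis=0).argmax()

    # 3. Votes weighted by algorithm performance (estimated accuracy).
    perf_weighted = np.bincount(preds, weights=accs,
                                minlength=n_classes).argmax()
    return majority, prob_weighted, perf_weighted
```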
“…Majority voting is an ensemble strategy that selects, among the alternatives, the predicted class with the most votes. AdaBoost is a boosting meta-algorithm that iteratively re-weights the training data based on the error of the base classifier (22). The proposed framework consists of two main components, viz.…”
Section: Proposed Voting-Boosting Ensemble Model
Citation type: mentioning (confidence: 99%)
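The re-weighting step this statement attributes to AdaBoost is the standard discrete AdaBoost update: misclassified training instances are up-weighted so that the next base classifier concentrates on them. A sketch of one boosting round, with variable names ours:

```python
import numpy as np

def adaboost_reweight(sample_weights, y_true, y_pred):
    """One round of the discrete AdaBoost update (assumes 0 < err < 1)."""
    miss = (y_pred != y_true)
    err = np.average(miss, weights=sample_weights)   # weighted training error
    alpha = 0.5 * np.log((1 - err) / err)            # base classifier's say
    # Up-weight misclassified instances, down-weight correct ones.
    sample_weights = sample_weights * np.exp(alpha * np.where(miss, 1, -1))
    return sample_weights / sample_weights.sum(), alpha
```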
“…Uses simple statistics (like calculating the mean) to combine predictions [4]. It is also possible to take the outputs of the base learners on the training data and apply another learning algorithm to them to predict the response values [8].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
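Both strategies in this snippet, averaging base-learner outputs and fitting a second-stage learner on them (stacking), are standard and available off the shelf in scikit-learn. A minimal sketch, with the base and meta estimator choices our own:

```python
from sklearn.ensemble import StackingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

base = [('tree', DecisionTreeClassifier()), ('nb', GaussianNB())]

# Combining with simple statistics: soft voting averages the base
# learners' predicted class probabilities.
averaged = VotingClassifier(estimators=base, voting='soft')

# Stacking: a meta-model (here logistic regression) is trained on the
# base learners' cross-validated predictions.
stacked = StackingClassifier(estimators=base,
                             final_estimator=LogisticRegression())
```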
“…Whilst voting/stacking reduces bias by fixing the errors the base learners make, fitting one or more meta-models on the base learners' predictions [3,8].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)