2003
DOI: 10.1016/s0031-3203(03)00191-2

Vote counting measures for ensemble classifiers

Cited by 30 publications
(28 citation statements)
References 33 publications
“…In [13] first and second order Walsh coefficients were used for ensemble pruning. The motivation for using Walsh coefficients in ensemble design is fully explored in [6] and [18]. For further understanding of the meaning and applications of Walsh coefficients see [19].…”
Section: Introduction (mentioning)
confidence: 99%
“…This mapping may be analysed using Walsh spectral coefficients. First order Walsh coefficients were shown to provide a measure of class separability for selecting optimal base classifiers in [2], in which it is also shown that this does not imply optimality of the ensemble. In contrast, in [3] it was shown that second order Walsh coefficients may be used to determine optimal ensemble performance.…”
Section: Introduction (mentioning)
confidence: 99%
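The snippets above describe first-order Walsh coefficients as a class separability measure over the binary mapping from base-classifier decisions to class labels. As a minimal sketch (not the cited papers' implementation; the function name and NumPy formulation are illustrative assumptions), a first-order coefficient for each base classifier can be computed as the correlation between its {−1, +1}-encoded votes and the {−1, +1}-encoded labels:

```python
import numpy as np

def first_order_walsh(decisions, labels):
    """Illustrative first-order Walsh coefficients for a two-class ensemble.

    decisions: (n_patterns, n_classifiers) array of binary votes in {0, 1}
    labels:    (n_patterns,) array of class labels in {0, 1}
    Returns one coefficient per base classifier; a magnitude near 1 means
    that classifier's votes separate the two classes well, near 0 means
    its votes carry little class information on their own.
    """
    d = 2 * np.asarray(decisions, dtype=float) - 1  # map {0,1} -> {-1,+1}
    y = 2 * np.asarray(labels, dtype=float) - 1
    return d.T @ y / len(y)  # per-classifier correlation of votes with labels
```

For example, a classifier whose votes always match the labels gets a coefficient of 1.0, while one whose votes are independent of the labels gets a coefficient near 0 — which is why, as noted in [2], first-order coefficients rank individual base classifiers but do not by themselves imply optimality of the combined ensemble.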
“…In contrast, in [3] it was shown that second order Walsh coefficients may be used to determine optimal ensemble performance. The motivation for using Walsh coefficients in ensemble design is fully explored in [4] and [2]. For further understanding of the meaning and applications of Walsh coefficients see [5] and [6].…”
Section: Introduction (mentioning)
confidence: 99%
“…The problem with applying classical class separability measures to the binary mapping associated with MCS (equation (1)) is that the implicit Gaussian assumption is not appropriate [11]. In [4], [12] a class separability measure is proposed for MCS that is based on a binary feature representation, in which each pattern is represented by its binary ensemble classifier decisions. It is restricted to two-class problems and results in a binary-to-binary mapping.…”
Section: Diversity/Accuracy and MCS (mentioning)
confidence: 99%
“…The measure is based on a spectral representation that was first proposed for two-class problems in [2], and later developed in the context of Multiple Classifier Systems in [3]. It was shown for two-class problems in [4] that over-fitting of the training set could be detected by observing the separability measure as it varies with base classifier complexity. Since realistic learning problems are in general ill-posed [5], it is known that any attempt to automate the learning task must make some assumptions.…”
Section: Introduction (mentioning)
confidence: 99%