Proceedings of the 13th International Conference on Pattern Recognition, 1996
DOI: 10.1109/icpr.1996.547205
Combining classifiers

Cited by 672 publications (875 citation statements). References 13 publications.
“…Given the posterior estimates from the individual classifiers, the goal of the combining stage is to produce a single estimate that maximizes the probability for localized object detection while reducing clutter and false alarms. Various integration methods have been proposed in the past [5]. We formulate a supra-Bayesian integration in which the posterior estimates from each classifier are assumed to have a probability distribution and, based on the means and variances of the outputs, we can formulate an optimal decision scheme.…”
Section: Bayesian Classification
confidence: 99%
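The excerpt above describes fusing per-classifier posterior estimates using the means and variances of their outputs. Below is a minimal sketch of one such scheme, assuming each classifier reports a posterior estimate together with a variance; the inverse-variance weighting, the function name, and the array shapes are illustrative assumptions, not the cited paper's exact decision rule.

```python
import numpy as np

def combine_posteriors(means: np.ndarray, variances: np.ndarray) -> np.ndarray:
    """Fuse per-classifier posterior estimates into one class distribution.

    means:     (n_classifiers, n_classes) posterior estimates P(C_i | x)
    variances: (n_classifiers, n_classes) uncertainty of each estimate

    Precision-weighted average: estimates with lower variance get more
    weight. A simple stand-in for the supra-Bayesian scheme sketched in
    the excerpt, not the paper's exact rule.
    """
    precision = 1.0 / variances                      # inverse variance
    fused = (precision * means).sum(axis=0) / precision.sum(axis=0)
    return fused / fused.sum()                       # renormalize over classes

# Two classifiers over three classes; the second is far less certain.
means = np.array([[0.7, 0.2, 0.1],
                  [0.4, 0.4, 0.2]])
variances = np.array([[0.01, 0.01, 0.01],
                      [0.10, 0.10, 0.10]])
print(combine_posteriors(means, variances))  # dominated by classifier 1
```

Inverse-variance weighting is simply the easiest way to let a more confident classifier dominate; the supra-Bayesian formulation in the excerpt derives its combination from an explicit probability model over the classifier outputs instead.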
“…P(C_i | X_1, . . ., X_m) = P_i(P(C_i | X_1), P(C_i | X_2), . . ., P(C_i | X_m)) (9). Considering that P_i has to obey the laws of probability if an integrated measure of the posterior is to be estimated, the integration may be written as:…”
Section: Multifeature Integration
confidence: 99%
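Equation (9) only states that the integrated posterior is some function P_i of the per-feature posteriors, constrained to obey the laws of probability. As a hedged illustration, the sketch below picks one concrete, probability-respecting choice for P_i, the normalized product over features; the function name and the independence assumption behind the product are mine, not the excerpt's.

```python
import numpy as np

def integrate_posteriors(per_feature: np.ndarray) -> np.ndarray:
    """One concrete choice for the combining function P_i in Eq. (9).

    per_feature: (m_features, n_classes); row k holds P(C_i | X_k)
                 for every class C_i.

    Multiplies the per-feature posteriors (an independence assumption)
    and renormalizes so the result obeys the laws of probability.
    """
    fused = per_feature.prod(axis=0)   # product over the m features
    return fused / fused.sum()         # normalize to a valid distribution

# Three features voting over two classes.
per_feature = np.array([[0.8, 0.2],
                        [0.6, 0.4],
                        [0.7, 0.3]])
print(integrate_posteriors(per_feature))  # ~[0.93, 0.07]
```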
“…Various integration methods have been proposed in the past [9]. We formulate a supra-Bayesian integration in which the posterior estimates from each classifier are assumed to have a probability distribution and, based on the means and variances of the outputs, we can formulate an optimal decision scheme. Strictly speaking, Bayesian theory holds true only for individual decision makers, but if the group decision is viewed as a collaborative effort, the effect is externally Bayesian.…”
Section: Multifeature Integration
confidence: 99%
“…As the term ½ does not depend on the class, we may neglect it for classification purposes. Note that this motivation for the sum rule differs from that proposed by KITTLER in [8]. Using multiple classifiers to classify a single test pattern, he assumed that the a-posteriori probabilities computed by the respective classifiers do not differ much from the a-priori probabilities to justify the sum rule.…”
Section: The Virtual Test Sample Method
confidence: 99%
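The sum-rule justification attributed to KITTLER can be checked numerically: if each posterior satisfies P(C_i | x_k) = P(C_i)(1 + δ_ik) with small δ, expanding the product rule to first order in δ yields the sum rule. The sketch below is an illustrative check under that assumption, not code from either paper.

```python
import numpy as np

rng = np.random.default_rng(0)
priors = np.array([0.5, 0.3, 0.2])
m = 4  # number of classifiers

# Posteriors that deviate only slightly from the priors:
# P(C_i | x_k) = P(C_i) * (1 + delta_ik) with small delta.
delta = 0.05 * rng.standard_normal((m, priors.size))
posteriors = priors * (1.0 + delta)

# Product rule: P(C_i)^(1-m) * prod_k P(C_i | x_k), up to normalization.
product_rule = posteriors.prod(axis=0) / priors ** (m - 1)

# Sum rule: (1 - m) * P(C_i) + sum_k P(C_i | x_k), the first-order
# linearization of the product rule in delta.
sum_rule = (1 - m) * priors + posteriors.sum(axis=0)

print(product_rule / product_rule.sum())  # the two rankings agree closely
print(sum_rule / sum_rule.sum())          # when delta is small
```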
“…Many authors such as SCHWENK use TD within artificial neural nets [7]. Finally, the virtual test sample method proposed in Section 3 was motivated by KITTLER's research on classifier combination [8].…”
Section: Related Work
confidence: 99%