1997
DOI: 10.1016/s0952-1976(97)00020-1
A framework for probabilistic combination of multiple classifiers at an abstract level

Cited by 18 publications (9 citation statements)
References 10 publications
“…6a illustrates a "hole" made of one background pixel. In general, the foreground pixels enumerated 14, 15, 16, 18, 19, 21, 22, and 23 are its 8-neighbors, and pixels 15, 18, 19, and 22 are its 4-neighbors. Therefore, the last subset would be the thin representation of the loop; see Fig.…”
Section: Loop Extraction
confidence: 99%
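The excerpt above distinguishes a background pixel's 8-neighbors (all adjacent cells, including diagonals) from its 4-neighbors (horizontal and vertical only). A minimal sketch of that distinction, using (row, col) coordinates rather than the paper's own pixel enumeration:

```python
# Sketch of 4- vs 8-connectivity for a pixel in a grid. The grid size and
# coordinate convention here are illustrative, not taken from the cited paper.
def neighbors(r, c, rows, cols, connectivity=8):
    """Return in-bounds neighbor coordinates of cell (r, c)."""
    if connectivity == 4:
        # Horizontal/vertical neighbors only
        offsets = [(-1, 0), (0, -1), (0, 1), (1, 0)]
    else:
        # All eight surrounding cells
        offsets = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0)]
    return [(r + dr, c + dc) for dr, dc in offsets
            if 0 <= r + dr < rows and 0 <= c + dc < cols]

# Interior cell of a 3x3 grid: 8 neighbors vs 4 neighbors
print(len(neighbors(1, 1, 3, 3, connectivity=8)))  # 8
print(len(neighbors(1, 1, 3, 3, connectivity=4)))  # 4
```

The same bounds check also handles border pixels, which have fewer in-grid neighbors.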
“…Some of them proposed embedding offline-like data in the classical stroke representation during preprocessing ([11], [12], [13], [14]). Other methods suggested integration with pure offline word recognition systems at the postprocessing stage ([15], [16], [17], [18]). We propose using the same online word recognition engine to process both types of data.…”
Section: Introduction
confidence: 99%
“…In addition, many classifiers such as ANNs [89], hidden Markov models (HMMs) [94], and SVMs [95] can be used to recognize characters after feature extraction. Some researchers integrate two kinds of classification schemes [126], [127], use multistage classification schemes [128], or a "parallel" combination of multiple classifiers [129], [130]. d) Discussion: Although significant progress in LPR techniques has been made in the last few decades and commercial products exist, there is still plenty of work to be done.…”
Section: Vehicle Recognition
confidence: 99%
“…The most often used classifier-fusion approaches include majority voting (Xu et al., 1992); the weighted combination (weighted averaging) (Kuncheva, 2004); probabilistic schemes (Kittler et al., 1997; Kittler et al., 1998); various rank-ordered rules, such as the Borda count (Ho et al., 1994; E. Kim et al., 2002); the sum rule (averaging), product rule, max rule, min rule, and median rule (Kittler et al., 1998); the Bayesian approach (naïve Bayes combination) (Altincay, 2005; Kuncheva, 2004; Xu et al., 1992); the Dempster-Shafer (D-S) theory of evidence (Denoeux, 1995; Xu et al., 1992); the behavior-knowledge space (BKS) method (Huang & Suen, 1995; Shipp & Kuncheva, 2002); the fuzzy integral (Chi et al., 1996; Kuncheva, 2004; Mirhosseini et al., 1998); fuzzy templates (Kuncheva et al., 1998); decision templates (Kuncheva, 2001, 2004); combination through order statistics (Kang et al., 1997a, 1997b); and combination by a neural network (Ceccareli & Petrosino, 1997). A recent review paper (Oza & Tumer, 2008) summarizes the leading ensemble methods and discusses their application to four broad classes of real-world classification problems.…”
Section: Introduction
confidence: 99%
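Several of the fusion rules named in the excerpt above are simple enough to sketch directly. The following is a minimal illustration of the sum rule, the product rule, and abstract-level majority voting (the setting of the indexed 1997 paper); the posterior vectors are invented for illustration and are not taken from any cited work:

```python
# Hedged sketches of three classifier-fusion rules from the excerpt above.
# Each "posteriors" entry is one classifier's probability vector over classes.

def sum_rule(posteriors):
    """Pick the class with the largest summed posterior (averaging rule)."""
    n_classes = len(posteriors[0])
    scores = [sum(p[c] for p in posteriors) for c in range(n_classes)]
    return max(range(n_classes), key=scores.__getitem__)

def product_rule(posteriors):
    """Pick the class with the largest product of posteriors."""
    n_classes = len(posteriors[0])
    scores = []
    for c in range(n_classes):
        prod = 1.0
        for p in posteriors:
            prod *= p[c]
        scores.append(prod)
    return max(range(n_classes), key=scores.__getitem__)

def majority_vote(labels):
    """Abstract-level fusion: each classifier emits one crisp label."""
    return max(set(labels), key=labels.count)

# Three classifiers, three classes (illustrative numbers only)
posteriors = [[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.5, 0.3, 0.2]]
print(sum_rule(posteriors))       # 0
print(product_rule(posteriors))   # 0
print(majority_vote([0, 1, 0]))   # 0
```

The sum rule is known to be more robust than the product rule when individual posterior estimates are noisy, since a single near-zero estimate zeroes out the product.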