Classic Works of the Dempster-Shafer Theory of Belief Functions
DOI: 10.1007/978-3-540-44792-4_27

Combining the Results of Several Neural Network Classifiers

Cited by 55 publications (83 citation statements)
References 4 publications
“…Further, the results of our method is compared with two evidence-based combining classifier methods proposed by Rogova [33] and Tabassian, et. al.…”
Section: Methods (mentioning)
confidence: 99%
“…We will present a framework in which the combination method is updated on-line in a sample-wise, strictly incremental manner. On-line extensions of various well-known batch ensemble (classifier fusion) methods are presented as the combination method, such as Fuzzy Integral [17,4], Decision Templates [26,30] and ensembles based on Dempster-Shafer theory [48,53]. In this sense, it can be seen as an extension of the work in [33], where Naive Bayes [69] and BKS [21] were used as combination rule.…”
Section: Sannen et al. / Towards Incremental Classifier Fusion (mentioning)
confidence: 99%
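The on-line, sample-wise combination described in the statement above can be illustrated with a short sketch. The class and method names below are hypothetical and the update rule is only one plausible reading (Decision Templates maintained as a running mean of decision profiles per class); it is not the code of Sannen et al.

    import numpy as np

    class IncrementalDecisionTemplates:
        # Sample-wise (incremental) Decision Templates combiner: one template per
        # class, updated as a running mean of the decision profiles
        # (n_classifiers x n_classes soft outputs) seen so far.
        def __init__(self, n_classifiers, n_classes):
            self.templates = np.zeros((n_classes, n_classifiers, n_classes))
            self.counts = np.zeros(n_classes, dtype=int)

        def update(self, decision_profile, true_label):
            # Running-mean update of the template for the observed class only.
            self.counts[true_label] += 1
            n = self.counts[true_label]
            self.templates[true_label] += (decision_profile - self.templates[true_label]) / n

        def predict(self, decision_profile):
            # Assign the class whose template is closest (squared Euclidean
            # distance over the whole profile) to the new decision profile.
            dists = ((self.templates - decision_profile) ** 2).sum(axis=(1, 2))
            return int(np.argmin(dists))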
“…Batch Training In [48] a way to apply the Dempster-Shafer theory of evidence to the problem of classifier fusion is described. The Dempster-Shafer combination training is like the Decision Templates training: the c decision templates are calculated from the data set in the same way -see Eq.…”
Section: Incremental Dempster-Shafer Combination Ensemble (mentioning)
confidence: 99%
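A minimal sketch of the batch training described here, plus a simplified Dempster-Shafer fusion step in the spirit of Rogova [48], is given below. The proximity measure (an inverse squared distance) and the restriction to evidence for each class are assumptions made for brevity; Rogova's method also combines evidence against each class.

    import numpy as np

    def batch_decision_templates(profiles, labels, n_classes):
        # Batch training: the c decision templates are the class-wise means of
        # the decision profiles, exactly as in Decision Templates training.
        # profiles: (n_samples, n_classifiers, n_classes) soft outputs.
        return np.stack([profiles[labels == c].mean(axis=0) for c in range(n_classes)])

    def ds_combine(profile, templates):
        # For classifier i and class c, turn the proximity between the
        # classifier's output row and the template row into a simple support
        # mass m_i({c}). All masses focus on the same singleton, so Dempster's
        # rule reduces to m({c}) = 1 - prod_i (1 - m_i({c})).
        n_classes = templates.shape[0]
        combined = np.empty(n_classes)
        for c in range(n_classes):
            d = ((profile - templates[c]) ** 2).sum(axis=1)   # per-classifier distance
            prox = 1.0 / (1.0 + d)                            # assumed proximity measure
            combined[c] = 1.0 - np.prod(1.0 - prox)
        return int(np.argmax(combined)), combined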
“…[Table residue from the citing paper's "Features" section: a taxonomy of classifier-combination approaches (different principles, different parameters or initializations, multiple features, "test and select" multistage classifiers, representation, random selection, "stacked generalization", "adaptive selection") with citations including Xu et al., Ho, Giacinto and Roli, Woods et al., Ng and Singh, Cao et al., Rogova, Sharkey et al., Cho and Kim, and Wolpert; the remainder of the extracted passage is author-biography text.]…”
Section: Features (mentioning)
confidence: 99%