Combining the results of several neural network classifiers
Year: 1994
DOI: 10.1016/0893-6080(94)90099-x

Cited by 324 publications (79 citation statements).
References 3 publications.
“…Various authors have investigated the use of Dempster-Shafer theory for combining the results of different classifiers [2,6,10,15]. However, the aim of using Dempster-Shafer theory in this context is quite different from our aim in this paper.…”
Section: Related Work (contrasting)
confidence: 45%
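The statement above refers to Dempster-Shafer theory for combining classifier results without showing the mechanism. The following minimal sketch illustrates Dempster's rule of combination applied to two hypothetical classifier outputs expressed as mass functions over subsets of the class set; the function name, class labels, and mass values are illustrative assumptions, not taken from the cited paper or from references [2,6,10,15].

```python
# Minimal sketch of Dempster's rule of combination for fusing the outputs
# of two classifiers. The mass values below are illustrative only.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset_of_classes: mass}."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("Total conflict: the sources cannot be combined.")
    # Normalise by the non-conflicting mass (1 - K in Dempster's rule).
    return {a: w / (1.0 - conflict) for a, w in combined.items()}

# Example: two classifiers expressing belief over classes {"A", "B", "C"}.
theta = frozenset({"A", "B", "C"})
m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.3, theta: 0.1}
m2 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.2, theta: 0.3}
print(dempster_combine(m1, m2))
```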
“…The effectiveness of an ensemble can be measured by the extent to which the members are error-independent (show different patterns of generalization) [19]. The ideal would be a set of models where each of the models generalizes well, and when they do make errors on new data, these errors are not shared with any other models [19].…”
Section: Ensemble Modeling (mentioning)
confidence: 99%
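As an illustration of the error-independence idea quoted above, the sketch below checks how often pairs of ensemble members err on the same samples; the helper name, the toy predictions, and the simple shared-error statistic are assumptions for illustration, not the measure used in [19].

```python
# Illustrative sketch (not from the cited paper) of inspecting error-independence
# between ensemble members: compare where each model errs and count how often
# two models err on the same examples.
import numpy as np

def shared_error_rates(predictions, y_true):
    """predictions: (n_models, n_samples) array of predicted labels.
    Returns an (n_models, n_models) matrix whose (i, j) entry is the
    fraction of samples on which models i and j are *both* wrong."""
    errors = (predictions != y_true)                       # boolean error indicators
    return (errors[:, None, :] & errors[None, :, :]).mean(axis=2)

# Toy example: three models, five samples, no shared errors off the diagonal.
y_true = np.array([0, 1, 1, 0, 1])
preds = np.array([
    [0, 1, 0, 0, 1],   # model 0 errs only on sample 2
    [0, 0, 1, 0, 1],   # model 1 errs only on sample 1
    [1, 1, 1, 0, 1],   # model 2 errs only on sample 0
])
print(shared_error_rates(preds, y_true))
```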
“…Where we interpret the classifier outputs as the support for the classes, fuzzy aggregation methods can be applied, such as simple connectives between fuzzy sets or the fuzzy integral [23,22,66,128]; if the classifier outputs are possibilistic, Dempster-Shafer combination rules can be applied [108]. Statistical methods and similarity measures to estimate classifier correlation have also been used to evaluate expert system combination for a proper design of multi-expert systems [58].…”
Section: Non-generative Ensembles (mentioning)
confidence: 99%
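The statement above only names the family of "simple connectives between fuzzy sets". The sketch below shows what such an aggregation of class supports might look like, assuming min, max, and mean connectives; the function name and support values are illustrative and not drawn from the cited references.

```python
# Minimal sketch of simple connectives for combining classifier outputs
# interpreted as class supports in [0, 1]. Values are illustrative only.
import numpy as np

def combine_supports(supports, rule="mean"):
    """supports: (n_classifiers, n_classes) array of class supports.
    Returns the aggregated support per class under a simple connective."""
    rules = {
        "min":  supports.min(axis=0),    # pessimistic (fuzzy intersection)
        "max":  supports.max(axis=0),    # optimistic (fuzzy union)
        "mean": supports.mean(axis=0),   # averaging connective
    }
    return rules[rule]

# Three classifiers giving supports for three classes.
supports = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.4, 0.5, 0.1],
])
for rule in ("min", "max", "mean"):
    agg = combine_supports(supports, rule)
    print(rule, agg, "-> class", int(agg.argmax()))
```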