Pattern Recognition 2001
DOI: 10.1142/9789812386533_0015

Combining Classifiers: Soft Computing Solutions

Cited by 94 publications (58 citation statements)
References 58 publications
“…They propose to change (i) starting point in the hypothesis space, (ii) training data, (iii) classifier architectures, and (iv) traversal of the hypothesis space. In an early work, Kuncheva [20] proposes four different methods to build a multiclassifier system: (i) using different combiners when already trained base classifiers are given, (ii) using different algorithms and starting parameters, (iii) using different feature subsets, and (iv) using different training sets.…”
Section: Related Work
confidence: 99%
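The four construction routes in the quoted passage can be illustrated with a toy version of route (iii), using different feature subsets. Everything below — the data, the nearest-mean base classifier, and the chosen subsets — is a hypothetical sketch, not taken from the cited works:

```python
from collections import Counter

# Toy data: 4 features per sample, binary labels (made-up values).
train = [((1.0, 0.2, 5.0, 0.1), 0), ((0.9, 0.1, 4.8, 0.2), 0),
         ((0.1, 1.0, 0.3, 4.9), 1), ((0.2, 0.9, 0.2, 5.1), 1)]

def class_means(data, subset):
    """Per-class mean of the chosen feature subset (a nearest-mean 'base classifier')."""
    sums, counts = {}, {}
    for x, y in data:
        proj = [x[i] for i in subset]
        sums[y] = [a + b for a, b in zip(sums.get(y, [0.0] * len(subset)), proj)]
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def predict(means, x, subset):
    """Assign x to the class whose mean (on this subset) is closest."""
    proj = [x[i] for i in subset]
    return min(means, key=lambda y: sum((a - b) ** 2
                                        for a, b in zip(proj, means[y])))

# Route (iii): one base classifier per feature subset, combined by majority vote.
subsets = [(0, 1), (2, 3), (0, 3)]
models = [class_means(train, s) for s in subsets]

def ensemble_predict(x):
    votes = [predict(m, x, s) for m, s in zip(models, subsets)]
    return Counter(votes).most_common(1)[0][0]

print(ensemble_predict((1.0, 0.2, 4.9, 0.1)))  # expected: class 0
```

The same skeleton covers routes (ii) and (iv): vary the base-learner algorithm or resample the training set instead of varying `subsets`.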
“…BAG: The original Bagging algorithm proposed in [7]. We train decision tree ensembles of size 5, 10, 15, 20, 25, 30 and choose the one with the best val-B accuracy.…”
Section: Comparison With AdaBoost and Bagging
confidence: 99%
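The quoted protocol — train bagged ensembles of several sizes and keep the one with the best validation accuracy — can be sketched in pure Python. The decision stumps and the toy 1-D data below are stand-ins for illustration, not the cited paper's actual setup:

```python
import random
from collections import Counter

random.seed(0)

# Toy 1-D data (hypothetical): class 0 centred at 0.0, class 1 at 1.0.
def sample(n):
    return [(random.gauss(y, 0.4), y)
            for y in (random.randint(0, 1) for _ in range(n))]

train, val = sample(200), sample(100)

def fit_stump(data):
    """Base learner: a threshold at the midpoint of the per-class means."""
    m0 = [x for x, y in data if y == 0]
    m1 = [x for x, y in data if y == 1]
    t = (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2
    return lambda x: 0 if x < t else 1

def bag(data, size):
    """Bagging: each stump is fit on a bootstrap resample of the training set."""
    stumps = [fit_stump([random.choice(data) for _ in data])
              for _ in range(size)]
    return lambda x: Counter(s(x) for s in stumps).most_common(1)[0][0]

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

# Model selection over ensemble sizes, as in the quoted protocol.
best = max((bag(train, k) for k in (5, 10, 15, 20, 25, 30)),
           key=lambda m: accuracy(m, val))
print(round(accuracy(best, val), 2))
```

Note the selection is done on a held-out validation split, as the quote's "val-B accuracy" suggests; reusing the training data here would bias the size choice upward.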
“…Alternatively, the combination of classifier outputs can be performed on an entire decision profile or the selected information to constrain a class decision. We refer to this alternative group of methods as class-indifferent methods [12]. In our work, the concept of the class-indifferent methods is slightly different from the ones aforementioned in [1], [12] and [23].…”
Section: Modelling Classifier Outputs and Combination Methods
confidence: 99%
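One way to read the distinction in this passage: a class-conscious combiner works on one column of the decision profile per class, while a class-indifferent combiner (e.g. decision templates) matches the entire profile at once. A toy sketch, with made-up support values and templates:

```python
# A decision profile collects every classifier's soft output for one sample:
# rows = base classifiers, columns = classes (all values here are made up).
profile = [
    [0.7, 0.3],   # classifier 1 support for (class 0, class 1)
    [0.6, 0.4],   # classifier 2
    [0.4, 0.6],   # classifier 3
]

# Class-conscious combination: operate column by column (here, mean support).
col_mean = [sum(row[j] for row in profile) / len(profile)
            for j in range(len(profile[0]))]
class_conscious = max(range(len(col_mean)), key=col_mean.__getitem__)

# Class-indifferent combination: compare the ENTIRE profile against a stored
# per-class template (decision-template style; templates are hypothetical).
templates = {
    0: [[0.8, 0.2], [0.7, 0.3], [0.6, 0.4]],
    1: [[0.3, 0.7], [0.2, 0.8], [0.1, 0.9]],
}

def sq_dist(p, t):
    """Squared Euclidean distance between two profiles, element by element."""
    return sum((a - b) ** 2 for pr, tr in zip(p, t) for a, b in zip(pr, tr))

class_indifferent = min(templates, key=lambda c: sq_dist(profile, templates[c]))
print(class_conscious, class_indifferent)
```

With these numbers both combiners agree on class 0, but they need not: the template match can exploit cross-classifier patterns that column averaging discards.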
“…The design of a method for combining classifier decisions is a challenging task in constructing an effective ensemble and various methods have been developed in the past decades. Kuncheva in [12] roughly characterizes combination methods, based on the forms of classifier outputs, into two categories. In the first category, the combination of decisions is performed on single classes, such as majority voting and Bayesian probability, which have extensively been examined in the ensemble literature [9], [11] and [23].…”
Section: Introduction
confidence: 99%
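The first category named in the quote — combining decisions on single classes — covers both majority voting on labels and Bayesian-style combination of per-class probabilities. A minimal sketch of each; all values below are made up for illustration:

```python
import math
from collections import Counter

# Label outputs of five base classifiers for one sample (hypothetical).
votes = ["A", "B", "A", "A", "B"]
majority = Counter(votes).most_common(1)[0][0]   # -> "A" (3 votes to 2)

# Per-class probability outputs for the same sample; a naive-Bayes-style
# combiner multiplies per-classifier probabilities (log-sum for stability).
probs = [{"A": 0.6, "B": 0.4}, {"A": 0.3, "B": 0.7}, {"A": 0.8, "B": 0.2},
         {"A": 0.7, "B": 0.3}, {"A": 0.4, "B": 0.6}]
score = {c: sum(math.log(p[c]) for p in probs) for c in ("A", "B")}
bayes = max(score, key=score.get)

print(majority, bayes)
```

The product rule implicitly assumes the base classifiers' errors are independent, which is rarely exact in practice; majority voting makes no such assumption but discards the confidence information.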