Feature combination using boosting

2005
DOI: 10.1016/j.patrec.2005.03.029

Cited by 54 publications (25 citation statements)
References 14 publications
“…The traditional boosting produces only one component weak classifier at each iteration. By contrast, at each round of our extension of the boosting procedures, several weak classifiers are trained on samples of each feature, and then combined into a single one (a middle final classifier) [38]:…”
Section: Boosting
confidence: 99%
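The excerpt sketches one boosting round producing several per-feature weak learners that are merged into an intermediate classifier. A minimal numpy sketch of that scheme, under assumed details not in the excerpt (decision stumps as the weak learners, a log-odds-weighted vote as the merge, AdaBoost-style reweighting; all names here are illustrative, not from the cited paper):

import numpy as np

def train_stump(x, y, w):
    """Best threshold/polarity decision stump on one feature (weighted 0-1 error)."""
    best = (np.inf, 0.0, 1)
    for thr in np.unique(x):
        for pol in (1, -1):
            pred = np.where(pol * (x - thr) >= 0, 1, -1)
            err = np.sum(w * (pred != y))
            if err < best[0]:
                best = (err, thr, pol)
    return best  # (weighted error, threshold, polarity)

def boost_per_feature(X, y, n_rounds=10):
    """Each round: one stump per feature, merged into an intermediate classifier."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)           # sample weights, y assumed in {-1, +1}
    ensemble = []                     # per round: (alpha, list of stumps)
    for _ in range(n_rounds):
        stumps = []
        for j in range(d):            # one weak classifier per feature
            err, thr, pol = train_stump(X[:, j], y, w)
            beta = 0.5 * np.log((1 - err + 1e-10) / (err + 1e-10))
            stumps.append((j, thr, pol, beta))
        # intermediate classifier: sign of the beta-weighted stump vote
        votes = sum(b * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                    for j, t, p, b in stumps)
        h = np.sign(votes)
        err = np.sum(w * (h != y))
        alpha = 0.5 * np.log((1 - err + 1e-10) / (err + 1e-10))
        w *= np.exp(-alpha * y * h)   # AdaBoost-style reweighting
        w /= w.sum()
        ensemble.append((alpha, stumps))
    return ensemble

def predict(ensemble, X):
    """Sign of the alpha-weighted sum of the intermediate classifiers."""
    total = np.zeros(len(X))
    for alpha, stumps in ensemble:
        votes = sum(b * np.where(p * (X[:, j] - t) >= 0, 1, -1)
                    for j, t, p, b in stumps)
        total += alpha * np.sign(votes)
    return np.sign(total)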
“…Therefore, a second heuristic idea is to formulate it inside the exponent of the cost function (Eq. 38). Now the second-order Taylor approximation we want to optimize is defined as follows:…”
Section: Importance Weighted GentleBoost
confidence: 99%
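Equation (38) itself is elided in the excerpt. For orientation, here is the standard GentleBoost second-order expansion of the exponential cost, with a hypothetical importance weight $c(x)$ placed inside the exponent as the excerpt describes; the exact form in the cited paper may differ:

$$
J(F) = E\!\left[e^{-c(x)\,y\,F(x)}\right], \qquad
J(F+f) \approx E\!\left[e^{-c(x)\,y\,F(x)}\left(1 - c(x)\,y\,f(x) + \tfrac{1}{2}\,c(x)^2 f(x)^2\right)\right],
$$

using $y^2 = 1$ for labels $y \in \{-1,+1\}$; the unweighted case $c(x) \equiv 1$ recovers GentleBoost's usual Newton step $f(x) = E_w[y \mid x]$.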
“…Multiple classifier combination has been intensively studied for many years [23][24][25][26]. It is shown that combination of multiple neural networks can achieve higher performance over the best individual one.…”
Section: Integrated Prediction For Exchange Rate
confidence: 99%
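The combination the excerpt alludes to is often realized at the score level by averaging the networks' class-probability outputs (soft voting); a toy numpy sketch with placeholder outputs, not data from the cited work:

import numpy as np

def soft_vote(prob_list):
    """Average class-probability outputs of several classifiers, then take argmax."""
    return np.mean(prob_list, axis=0).argmax(axis=1)

# three hypothetical networks' probability outputs for 4 samples, 2 classes
p1 = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8], [0.7, 0.3]])
p2 = np.array([[0.8, 0.2], [0.6, 0.4], [0.3, 0.7], [0.4, 0.6]])
p3 = np.array([[0.7, 0.3], [0.3, 0.7], [0.1, 0.9], [0.6, 0.4]])
print(soft_vote([p1, p2, p3]))  # combined predictions: [0 1 1 0]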
“…Based on a given classifier set, the combination methods can be categorized according to the level of classifier outputs: abstract level (class label), rank level (rank order), and measurement level (class scores) [12,18]. Yin et al analyze multiple classifier systems from the perspective of feature combination [15]. And most popular systems are with multiple classifiers for character recognition [4,12,15].…”
Section: Introduction
confidence: 99%
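A compact sketch of the three output levels named in the excerpt; the function names, and the choice of majority vote, Borda count, and score summation as the respective fusion rules, are illustrative assumptions:

import numpy as np

def abstract_level(labels):
    """Majority vote over hard class labels; labels[i] = classifier i's predictions."""
    labels = np.asarray(labels)
    return np.array([np.bincount(col).argmax() for col in labels.T])

def rank_level(ranks):
    """Borda-style fusion: sum each class's rank (0 = best) across classifiers."""
    return np.asarray(ranks).sum(axis=0).argmin(axis=1)

def measurement_level(scores):
    """Score-level fusion: sum per-class scores across classifiers."""
    return np.asarray(scores).sum(axis=0).argmax(axis=1)

# two classifiers, three samples, abstract level:
print(abstract_level([[0, 1, 2], [0, 2, 2]]))  # -> [0 1 2] (ties keep the lower label)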