2008 IEEE Intelligent Vehicles Symposium
DOI: 10.1109/ivs.2008.4621221

On the importance of accurate weak classifier learning for boosted weak classifiers

Abstract: Recent work [1] has shown that improving model learning for weak classifiers can yield significant gains in the overall accuracy of a boosted classifier. However, most published classifier boosting research relies only on rudimentary learning techniques for weak classifiers. So while it is known that improving the model learning can greatly improve the accuracy of the resulting strong classifier, it remains to be shown how much can yet be gained by further improving the model learning at the weak classifier l…
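To make the abstract's setting concrete, the sketch below illustrates one common form of boosted weak classifier learning: Real AdaBoost with look-up-table (LUT) weak classifiers over binned scalar feature responses. This is an assumption-laden illustration, not the paper's actual method; the bin count, feature generation, and epsilon constant are placeholders.

```python
# Minimal sketch (assumed setup, not the paper's implementation) of Real AdaBoost
# with LUT weak classifiers built from weighted per-bin class histograms.
import numpy as np

EPS = 1e-6  # guards against log(0) in empty bins


def fit_lut_weak(responses, y, w, n_bins=16):
    """Fit one LUT weak classifier: per-bin outputs h_j = 0.5*log(W+_j / W-_j)."""
    edges = np.linspace(responses.min(), responses.max(), n_bins + 1)
    bins = np.clip(np.digitize(responses, edges[1:-1]), 0, n_bins - 1)
    w_pos = np.bincount(bins[y == +1], weights=w[y == +1], minlength=n_bins)
    w_neg = np.bincount(bins[y == -1], weights=w[y == -1], minlength=n_bins)
    lut = 0.5 * np.log((w_pos + EPS) / (w_neg + EPS))
    z = 2.0 * np.sqrt(w_pos * w_neg).sum()  # Schapire-Singer selection criterion
    return edges, lut, z


def boost(features, y, n_rounds=10):
    """features: (n_samples, n_features) scalar responses; y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    strong = []  # selected weak classifiers
    for _ in range(n_rounds):
        best = None
        for f in range(features.shape[1]):  # pick the feature with minimal Z
            edges, lut, z = fit_lut_weak(features[:, f], y, w)
            if best is None or z < best[0]:
                best = (z, f, edges, lut)
        z, f, edges, lut = best
        bins = np.clip(np.digitize(features[:, f], edges[1:-1]), 0, len(lut) - 1)
        w = w * np.exp(-y * lut[bins])  # reweight examples toward mistakes
        w /= w.sum()
        strong.append((f, edges, lut))
    return strong


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))
    y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
    model = boost(X, y, n_rounds=5)
    print(len(model), "weak classifiers selected")
```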

Citations: Cited by 5 publications (9 citation statements)
References: 23 publications
“…It is not impossible, because LUT classifiers with minefield partitions are prone to be selected as optimal classifiers on account of their possibly small Z values, according to (4). Similar observations are also presented in [22, 23].…”
Section: Assessment of LUT-Based Strong Classifier (supporting)
confidence: 55%
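The excerpt's point can be made concrete. Assuming the "(4)" it references is the Schapire-Singer selection criterion Z = 2 Σ_j sqrt(W+_j W-_j) over LUT bins (an assumption, since that equation is not reproduced here), a toy calculation shows why a "minefield" partition, whose bins each hold weight from only one class, drives Z toward zero and therefore looks optimal even when its near-empty bins are unreliable. The bin weights below are made-up numbers for illustration.

```python
# Hedged illustration of why minefield partitions get deceptively small Z values.
import numpy as np

def z_value(w_pos, w_neg):
    """Z = 2 * sum_j sqrt(W+_j * W-_j) over the LUT bins."""
    return 2.0 * np.sqrt(np.asarray(w_pos) * np.asarray(w_neg)).sum()

# A well-populated partition: both classes present in every bin.
print(z_value([0.30, 0.15, 0.05], [0.05, 0.15, 0.30]))   # ~0.79

# A "minefield" partition: many tiny single-class bins -> Z collapses to 0,
# so boosting would prefer it despite poor expected generalization.
print(z_value([0.125, 0.0, 0.125, 0.0, 0.125, 0.0, 0.125, 0.0],
              [0.0, 0.125, 0.0, 0.125, 0.0, 0.125, 0.0, 0.125]))  # 0.0
```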
“…Meanwhile, strong classifiers based on smoothed LUT classifiers have slightly worse convergence than those based on unsmoothed ones. Several smoothing schemes for LUT classifiers have been proposed, such as the smoothed response binning (SRB) method [18] and the least sigmoid regression (LSR) method [23]. The former is a weight-sum smoothing method with an adaptive smoothing region and an almost all-one smoothing factor, which needs to be extended to a more general form in both the smoothing factor and the smoothing width.…”
Section: Smoothed LUT Classifier (mentioning)
confidence: 99%
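As a rough illustration of the weight-sum smoothing idea the excerpt describes (not the cited SRB or LSR formulations), one can convolve the per-bin positive and negative weight histograms with a small kernel before forming the LUT outputs. The triangular kernel, its width, and the epsilon constant below are assumptions.

```python
# Sketch of smoothing per-bin weight sums before computing LUT outputs.
import numpy as np

EPS = 1e-6

def smoothed_lut(w_pos, w_neg, width=2):
    """Return LUT outputs from kernel-smoothed per-bin weight sums."""
    k = np.concatenate([np.arange(1, width + 2), np.arange(width, 0, -1)])
    k = k / k.sum()                           # normalized triangular kernel
    sp = np.convolve(w_pos, k, mode="same")   # smoothed positive weights
    sn = np.convolve(w_neg, k, mode="same")   # smoothed negative weights
    return 0.5 * np.log((sp + EPS) / (sn + EPS))

# Unsmoothed vs smoothed outputs on a sparse, minefield-like histogram:
w_pos = np.array([0.0, 0.2, 0.0, 0.0, 0.3, 0.0])
w_neg = np.array([0.25, 0.0, 0.0, 0.25, 0.0, 0.0])
print(0.5 * np.log((w_pos + EPS) / (w_neg + EPS)))  # extreme +/- bin outputs
print(smoothed_lut(w_pos, w_neg))                   # moderated, smoother LUT
```

Smoothing trades a little training-set convergence speed for less erratic bin outputs, which matches the excerpt's observation about smoothed versus unsmoothed LUT classifiers.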
“…These are both fast and discriminative [17], [18]. However, the benefit of improved modelling while keeping the fast lookup table approach has been shown in [19] and [14]. For Haar features this modelling approach yielded a 75% average error reduction [19]. Therefore we apply a similar Smoothed Response Binning method to our scalar LiteHOG and LiteHOG+ feature responses.…”
Section: Finding a 1D Projection (mentioning)
confidence: 99%
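To show how such a binned LUT is reused for scalar feature responses like the LiteHOG projections mentioned above, here is a hedged evaluation-time sketch; the bin edges and LUT values are placeholders rather than anything learned in [19] or [14].

```python
# Evaluation of a LUT weak classifier on a scalar feature response:
# bin the response, index the (possibly smoothed) LUT.
import numpy as np

edges = np.linspace(-1.0, 1.0, 17)         # 16 bins over an assumed response range
lut = np.tanh(np.linspace(-2.0, 2.0, 16))  # stand-in for learned per-bin outputs

def weak_output(response):
    """Look up the LUT output for one scalar feature response."""
    b = int(np.clip(np.digitize(response, edges[1:-1]), 0, len(lut) - 1))
    return lut[b]

# The strong-classifier score would sum such outputs over all boosted weak classifiers.
print(weak_output(-0.9), weak_output(0.05), weak_output(0.9))
```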