2007 IEEE 11th International Conference on Computer Vision
DOI: 10.1109/iccv.2007.4409038
Fast training and selection of Haar features using statistics in boosting-based face detection

Cited by 113 publications (85 citation statements); references 5 publications.
“…Our goal in this paper is not to provide an integrated state-of-the-art face detector system, but rather to provide a feature selection tool that can be combined with more advanced boosting methods, such as GentleBoost [2], Real AdaBoost [12], and Vector Boosting [3], in order to achieve state-of-the-art results. More interestingly, our method could be integrated into the learning method recently proposed by Pham and Cham [10], which achieves extremely fast learning times compared to previous methods. We believe that method's learning time could be reduced even further by using locally adapted features.…”
Section: Discussion (mentioning)
confidence: 99%
“…and [125] reported experimental results demonstrating better ROC curve performance than the traditional AdaBoost approach, though it appears unlikely that they can also outperform the state-of-the-art detectors such as [61,106]. Various efforts have also been made to improve the detector's test speed.…”
Section: Variations of the Boosting Learning Algorithm (mentioning)
confidence: 99%
“…Weighting of the weak classifiers can be conducted after the feature selection step. In [125], another fast method to train and select Haar features was presented. The method treated the training examples as high-dimensional random vectors and kept the first- and second-order statistics to build classifiers from the features.…”
Section: Variations of the Boosting Learning Algorithm (mentioning)
confidence: 99%
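The statistics-based weak learner described in the statement above can be illustrated with a minimal sketch. Assuming, purely as an illustration and not as the paper's exact formulation, that each class is summarized by the mean and variance of a single Haar feature's response, a weak classifier can be built from those two statistics alone by thresholding a Gaussian log-likelihood ratio. All names below are hypothetical.

# Minimal sketch (not the cited paper's exact algorithm): build a weak
# classifier for one Haar feature from first- and second-order statistics
# only, by fitting a 1-D Gaussian to each class's feature responses and
# comparing log-likelihoods.
import numpy as np

def gaussian_weak_classifier(pos_responses, neg_responses):
    """Return a decision function built from class-conditional mean/variance."""
    mu_p, var_p = pos_responses.mean(), pos_responses.var() + 1e-9
    mu_n, var_n = neg_responses.mean(), neg_responses.var() + 1e-9

    def predict(x):
        # Log-likelihood of x under each 1-D Gaussian class model.
        ll_p = -0.5 * np.log(2 * np.pi * var_p) - (x - mu_p) ** 2 / (2 * var_p)
        ll_n = -0.5 * np.log(2 * np.pi * var_n) - (x - mu_n) ** 2 / (2 * var_n)
        return np.where(ll_p > ll_n, 1, -1)  # +1 = face, -1 = non-face

    return predict

# Usage with synthetic responses of one Haar feature on face / non-face windows.
rng = np.random.default_rng(0)
faces = rng.normal(2.0, 1.0, 1000)
nonfaces = rng.normal(0.0, 1.5, 1000)
clf = gaussian_weak_classifier(faces, nonfaces)
print((clf(faces) == 1).mean(), (clf(nonfaces) == -1).mean())

Keeping only two scalars per class for each feature is one way such low-order summaries can make repeated feature evaluation during boosting cheap, which is the point the citation is making.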
“…Nevertheless, the performance improvement in their system is limited, since only simple Haar-like features are considered. Several works [10,11] proposed to explore more homogeneous feature types in order to improve the detector's performance. However, expanding the number and types of features automatically increases the size of the feature set and the storage memory.…”
Section: Related Work (mentioning)
confidence: 99%
“…In our approach, we adopt a total of nine generalized Haar-like features, including a group of extended Haar-like features proposed in [11] and four basic Haar-like features, to increase the detector's performance. …”
Section: Robustness of Haar-like Features (mentioning)
confidence: 99%
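For context on the generalized Haar-like features mentioned above: a basic Haar-like feature is the difference of adjacent rectangle sums, which an integral image makes computable in constant time per rectangle. The sketch below is illustrative only; the function names and the two-rectangle layout are assumptions, and the extended features of [11] add further rectangle configurations built on the same principle.

# Illustrative sketch of evaluating a basic two-rectangle Haar-like feature
# with an integral image; generalized features follow the same
# rectangle-sum pattern with different layouts.
import numpy as np

def integral_image(img):
    """Cumulative sums so any rectangle sum costs at most four lookups."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r, c, h, w):
    """Sum of pixels in the h-by-w rectangle whose top-left corner is (r, c)."""
    total = ii[r + h - 1, c + w - 1]
    if r > 0:
        total -= ii[r - 1, c + w - 1]
    if c > 0:
        total -= ii[r + h - 1, c - 1]
    if r > 0 and c > 0:
        total += ii[r - 1, c - 1]
    return total

def haar_two_rect_horizontal(ii, r, c, h, w):
    """Left w-column half minus right w-column half of a 2w-wide region."""
    return rect_sum(ii, r, c, h, w) - rect_sum(ii, r, c + w, h, w)

# Usage on a toy 24x24 detection window.
img = np.arange(24 * 24, dtype=float).reshape(24, 24)
ii = integral_image(img)
print(haar_two_rect_horizontal(ii, 0, 0, 24, 12))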