1999
DOI: 10.1016/s0920-5489(99)90850-4

Recognition of human head orientation based on artificial neural networks

Cited by 17 publications (23 citation statements); references 0 publications.
Citation statements: 1 supporting, 22 mentioning, 0 contrasting.
Citing publications span 1999 to 2014.
“…al. [6]. As compared to their results, we did not observe serious degradation on data from new users.…”
Section: Training and Results (supporting; confidence: 53%)
“…al. [6] describe a user-dependent neural network based system to estimate pan and tilt of a person. In their approach, color segmentation, ellipse fitting and Gabor-filtering on a segmented face are used for preprocessing.…”
Section: Estimating Head Pose With Neural Nets (mentioning; confidence: 99%)
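The preprocessing chain this excerpt describes (skin-color segmentation, ellipse fitting on the segmented head, then Gabor filtering) can be sketched as below. The HSV thresholds, kernel size, and filter parameters are illustrative assumptions, not values from the cited paper:

```python
# Sketch of the described pipeline: skin-color segmentation -> ellipse fit
# -> Gabor filtering. All numeric parameters are assumptions for illustration.
import cv2
import numpy as np

def preprocess_head(bgr):
    # 1. Color segmentation: crude skin mask in HSV space (assumed range).
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))

    # 2. Ellipse fitting on the largest skin blob approximates the head outline.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    head = max(contours, key=cv2.contourArea)  # assumes a blob was found
    (cx, cy), (w, h), angle = cv2.fitEllipse(head)

    # 3. Gabor filtering on the cropped face region produces the feature
    #    maps that would feed the neural network.
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    y0, y1 = max(int(cy - h / 2), 0), int(cy + h / 2)
    x0, x1 = max(int(cx - w / 2), 0), int(cx + w / 2)
    face = gray[y0:y1, x0:x1]
    feats = []
    for theta in np.linspace(0, np.pi, 4, endpoint=False):
        kern = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                  lambd=10.0, gamma=0.5)
        feats.append(cv2.filter2D(face, cv2.CV_32F, kern))
    return np.stack(feats)  # 4 orientation channels as network input
```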
“…Model-based approaches typically recover the face pose by establishing the relationship between a 3D face model and its two-dimensional (2D) projection [26][27][28]. Appearance-based approaches are based on view interpolation and their goal is to construct an association between appearance and face orientation [29][30][31]. Although appearance-based methods are simpler, they are expected to be less accurate than model-based approaches and are mainly used for pose discrimination.…”
Section: Face (Head) Orientation Estimation (mentioning; confidence: 99%)
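A minimal sketch of the appearance-based idea mentioned here: learn a direct association between face appearance and orientation, with prediction interpolating between stored views. The k-nearest-neighbour regressor over raw pixels is a simplifying assumption; the cited works use their own representations and interpolation schemes:

```python
# Appearance-based pose association via distance-weighted k-NN,
# which interpolates between the nearest stored training views.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def train_pose_regressor(face_images, pan_tilt_angles, k=3):
    # face_images: (N, H, W) grayscale crops; pan_tilt_angles: (N, 2) degrees.
    X = face_images.reshape(len(face_images), -1).astype(np.float32) / 255.0
    model = KNeighborsRegressor(n_neighbors=k, weights="distance")
    model.fit(X, pan_tilt_angles)
    return model

# Usage: model.predict(new_face.reshape(1, -1) / 255.0) -> [[pan, tilt]]
```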
“…Of the numerous techniques proposed for gaze estimation [31][32][33], the one proposed by Ebisawa [34] appears very promising and is directly applicable to this project. Their technique estimates the gaze direction based on the relative position between the pupil and the glint.…”
Section: Eye-gaze Determination and Tracking (mentioning; confidence: 99%)
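The pupil-glint idea this excerpt attributes to Ebisawa [34] can be illustrated as follows: the gaze point is estimated from the vector between the pupil centre and the corneal glint. The affine calibration mapping below is a simplifying assumption for illustration, not the method from [34]:

```python
# Gaze estimation from the pupil-glint vector with an assumed affine
# calibration fitted to a few known on-screen targets.
import numpy as np

def calibrate(glint_pupil_vectors, screen_points):
    # glint_pupil_vectors: (N, 2) pupil-minus-glint offsets in image pixels;
    # screen_points: (N, 2) known gaze targets. Fit [dx, dy, 1] -> (x, y).
    A = np.column_stack([glint_pupil_vectors,
                         np.ones(len(glint_pupil_vectors))])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # (3, 2) affine coefficients

def gaze_point(pupil_xy, glint_xy, coeffs):
    d = np.asarray(pupil_xy, float) - np.asarray(glint_xy, float)
    return np.append(d, 1.0) @ coeffs  # estimated on-screen gaze point
```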