2018
DOI: 10.48550/arxiv.1810.03286
Preprint

Guiding Intelligent Surveillance System by learning-by-synthesis gaze estimation

Cited by 2 publications (2 citation statements, 2019–2020) · References 25 publications
“…The first method is a simple cascaded method [8][32][30] which uses multiple k-NN (k-Nearest Neighbor) classifiers to select neighbors in a joint feature space of head pose, pupil center, and eye appearance. The other method is to train a simple convolutional neural network (CNN) [11][29][31] to predict the eye gaze direction with an ℓ2 loss. We train on UnityEyes, UTView, and SynthesEyes, and test on MPIIGaze and on purified MPIIGaze, which is purified by the proposed method.…”
Section: Appearance-based Gaze Estimation (mentioning)
confidence: 99%
“…The first method is a simple cascaded method [9][37][35] which uses multiple k-NN (k-Nearest Neighbor) classifiers to select neighbors in a joint feature space of head pose, pupil center, and eye appearance. The other method is to train a simple convolutional neural network (CNN) [12][34][36] to predict the eye gaze direction with an ℓ2 loss. We train on UnityEyes, UTView, and SynthesEyes, and test on MPIIGaze and on purified MPIIGaze, which is purified by the proposed method.…”
Section: Appearance-based Gaze Estimation (mentioning)
confidence: 99%
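
As a rough illustration of the second baseline described in these excerpts (a small CNN regressing gaze direction with an ℓ2 loss), below is a minimal PyTorch sketch. The network depth and width, the 36x60 grayscale eye-patch input, and the (yaw, pitch) output parameterization are assumptions made for the example, not details taken from the cited papers.

```python
# Minimal sketch of a CNN gaze regressor trained with an l2 (MSE) loss.
# Assumed: 36x60 grayscale eye patches as input, 2D (yaw, pitch) gaze angles as output.
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 36x60 -> 18x30
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 18x30 -> 9x15
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 9 * 15, 128), nn.ReLU(),
            nn.Linear(128, 2),                     # predicted (yaw, pitch)
        )

    def forward(self, x):
        return self.regressor(self.features(x))

model = GazeCNN()
criterion = nn.MSELoss()                           # l2 loss on gaze angles
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One hypothetical training step on a batch of eye images.
images = torch.randn(16, 1, 36, 60)                # grayscale eye patches
targets = torch.randn(16, 2)                       # ground-truth (yaw, pitch)
optimizer.zero_grad()
loss = criterion(model(images), targets)
loss.backward()
optimizer.step()
```

The same training step applies whether the batch comes from synthetic data (e.g. UnityEyes, UTView, SynthesEyes) or from MPIIGaze; only the data loading changes between the train and test settings described above.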