2019
DOI: 10.1109/access.2019.2956953
Focus on Area Tracking Based on Deep Learning for Multiple Users

Abstract: Most eye-tracking experiments are limited to single subjects because gaze points are difficult to track when multiple users are involved and environmental factors might cause interference. To overcome this problem, this paper proposes a method for gaze tracking that can be applied to multiple users simultaneously. Four models, including FASEM, FAEM and FAFRCM in the single-user environment, as well as FAEM and FAMAM in the multiple-user environment, are proposed, and we collected raw data of gazing behaviors …

Cited by: 1 publication (1 citation statement)
References: 20 publications (24 reference statements)
“…In recent years, due to the development of machine learning technology, many studies have used deep learning training methods to improve the accuracy of gaze point estimation [4,5]. Since Convolutional Neural Networks (CNNs) excel in many computer vision tasks, they are also used for gaze point estimation [2,6,7]. For CNN-based gaze point estimation, images serve as the training and prediction data, from which spatial features are extracted.…”
Section: Introduction
Confidence: 99%
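The pipeline the citing authors describe can be illustrated with a minimal, hypothetical sketch: an eye image is convolved with a kernel to extract spatial features, which a linear layer then regresses to a 2-D gaze point. All kernel and weight values below are toy placeholders, not the models from the paper.

```python
# Hypothetical sketch of CNN-style gaze-point regression (pure Python):
# convolve an "image" to extract spatial features, then linearly
# regress those features to an (x, y) gaze point on the screen.

def conv2d(image, kernel):
    """Valid 2-D convolution (no padding, stride 1) followed by ReLU."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            row.append(max(s, 0.0))  # ReLU activation
        out.append(row)
    return out

def predict_gaze(image, kernel, weights_x, weights_y):
    """Flatten the convolved feature map and regress an (x, y) gaze point."""
    features = [v for row in conv2d(image, kernel) for v in row]
    gx = sum(w * f for w, f in zip(weights_x, features))
    gy = sum(w * f for w, f in zip(weights_y, features))
    return gx, gy

# Toy 4x4 grayscale "eye image" and a 3x3 vertical-edge kernel.
image = [[0.0, 0.1, 0.2, 0.3],
         [0.1, 0.5, 0.6, 0.2],
         [0.2, 0.6, 0.5, 0.1],
         [0.3, 0.2, 0.1, 0.0]]
kernel = [[1.0, 0.0, -1.0],
          [1.0, 0.0, -1.0],
          [1.0, 0.0, -1.0]]
weights_x = [0.5, 0.5, 0.5, 0.5]    # 2x2 feature map -> 4 features
weights_y = [0.25, 0.25, 0.25, 0.25]

print(predict_gaze(image, kernel, weights_x, weights_y))
```

In a real system the kernel and regression weights would be learned end-to-end from labeled gaze data rather than fixed by hand, and deeper stacks of convolutions would replace the single layer shown here.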