Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications 2019
DOI: 10.1145/3314111.3319843
Characterizing joint attention behavior during real world interactions using automated object and gaze detection

Cited by 12 publications (9 citation statements); References 12 publications
“…In a driving situation, real-time gaze coding allows tracking of a driver's attention, detecting whether they are distracted or fatigued [106,255], and supports automated driving [170]. Gaze can also help diagnose mental health conditions or autism by analysing the scan path [236]. Moving beyond, gaze is also presented as a human-computer interface, facilitating control of Internet of Things (IoT) devices in a smart home system [127] and empowering people with a physical impairment to interact with applications such as creative art [49,140,263].…”
Section: Introduction
confidence: 99%
“…In contrast to manual qualitative evaluation, the merits of automated analysis include an unbiased judgement of the resultant gaze, enhanced by metrics that allow further prediction based on the level of attention. Frameworks for analysis of this calibre have been applied in a variety of domains, ranging from the social behaviour of adults [8,20,21] and infants [13] and classroom environments [4], [3] to human-robot interaction [17] and industrial environments [6].…”
Section: Introduction
confidence: 99%
“…Lemaignan et al [12] studied one's focus time on a target, determined by a broad field of attention estimated from the head pose, limiting the accurate alignment of the gaze to a small dynamic target. A study of joint attention [20] estimates the latency between the instruction and the resulting look, based on the presence of the gaze dot within the expected field of attention, whilst the latency and the longest and shortest look for each object of interest have been previously assessed [21]. However, these works [20,21] rely on specialised sensors to capture the gaze and object position, as opposed to a single video feed.…”
Section: Introduction
confidence: 99%
“…Public perception of these devices is overwhelmingly negative, as seen with the initial release of the Google Glass, as they infringe on the privacy of both the user and bystanders [20,59,70]. Daily users of eye tracking technology trade off the privacy of their everyday actions for the benefit of activity logging, gaze-based interfaces, and assistive applications [7,37,55,81]. Steil et al have developed a privacy approach specifically for the scene camera, using a controlled shutter to disable the video feed in private situations [75].…”
Section: Introduction
confidence: 99%