2017
DOI: 10.1007/s13218-017-0503-y
Automated interpretation of eye–hand coordination in mobile eye tracking recordings

Cited by 14 publications (14 citation statements)
References 20 publications
“…In human–computer interaction, for instance, stationary eye-tracking systems are well established for investigating usability aspects of software tools or websites [53]. Usability testing of portable products, however, requires mobile eye-tracking glasses, which allow the participant to move freely and without restriction while interacting with the product [48,49]. A camera integrated in the glasses records the participant's field of view ('scene video'), while two additional cameras located in the lower frame of the glasses record eye movement.…”
Section: Methods
confidence: 99%
“…Eye-tracking data showed how long a participant's gaze focused on a given user interface element; that is, the dwell time on an AOI. Whereas long dwell times on an AOI reflect intensive cognitive effort, short dwell times correspond to simple interface elements and thus to better product ease of use [49,66]. The user interface of the connected self-injection system was dissected into distinct user interface elements/AOIs as depicted in Figure 3.…”
Section: Methods
confidence: 99%
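The dwell-time metric described above is simply the total gaze duration accumulated per area of interest. A minimal sketch of that aggregation, assuming fixations have already been mapped to AOI labels (the function name, the `(aoi, duration_ms)` record shape, and the AOI labels are illustrative, not from the cited study):

```python
# Hypothetical sketch: total dwell time per AOI from labeled fixations.
# Input: iterable of (aoi_label, fixation_duration_ms) pairs.
from collections import defaultdict

def dwell_time_per_aoi(fixations):
    """Sum fixation durations (ms) for each area of interest (AOI)."""
    totals = defaultdict(float)
    for aoi, duration_ms in fixations:
        totals[aoi] += duration_ms
    return dict(totals)

# Example: fixation records for three assumed interface elements.
fixations = [("dose_display", 250), ("injection_button", 120),
             ("dose_display", 400), ("needle_cap", 90)]
print(dwell_time_per_aoi(fixations))
# → {'dose_display': 650.0, 'injection_button': 120.0, 'needle_cap': 90.0}
```

In practice the AOI label for each fixation would come from mapping gaze coordinates in the scene video onto the interface elements shown in Figure 3 of the citing paper; that mapping step is outside this sketch.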
“…Closely linked is the work by [6], which focuses on the automatic interpretation of specific gestural instances involving eye–hand coordination in the context of mobile eye-tracking. To quote [6]: this work aims to automatically detect cognitively demanding phases in mobile eye tracking recordings.…”
Section: Content
confidence: 99%
“…To quote [6]: this work aims to automatically detect cognitively demanding phases in mobile eye tracking recordings. The approach presented combines the user's perception (gaze) and action (hand) to isolate demanding interactions based on a multi-modal feature-level fusion.…”
Section: Content
confidence: 99%