Proceedings of the 2013 International Conference on Intelligent User Interfaces (IUI 2013)
DOI: 10.1145/2449396.2449416

Subtle gaze-dependent techniques for visualising display changes in multi-display environments

Abstract: This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays, called FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be directly deployed to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms this system can detect which display the user …
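The abstract describes a system that decides which display the user is attending by analyzing video from a webcam mounted on each display. The selection logic might look like the following minimal sketch; the per-camera face scores, the class name, and the smoothing window are illustrative assumptions, not the authors' implementation:

```python
from collections import Counter, deque


class DisplaySelector:
    """Hypothetical sketch: pick the attended display from per-camera
    face-detection scores, smoothing over recent frames to reduce flicker."""

    def __init__(self, window=3):
        # Keep the last `window` per-frame winners for a majority vote.
        self.history = deque(maxlen=window)

    def update(self, scores):
        """scores maps a display id to a frontal-face score from that
        display's webcam; returns the smoothed attended display id."""
        best = max(scores, key=scores.get)
        self.history.append(best)
        # Majority vote over the recent winners.
        return Counter(self.history).most_common(1)[0][0]


sel = DisplaySelector(window=3)
sel.update({"left": 0.2, "right": 0.9})   # clear frame: user faces "right"
sel.update({"left": 0.8, "right": 0.3})   # one noisy frame
attended = sel.update({"left": 0.1, "right": 0.7})
print(attended)  # -> right
```

The temporal smoothing is a design choice to keep a single noisy frame from flipping the attended display; the real system's detection pipeline (face finding, score computation) is not shown here.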

Cited by 39 publications (37 citation statements)
References 20 publications
“…Research on developing eye trackers investigates reducing the cost of existing infrared-based trackers (e.g., Seeing Machines, n.d.; Tobii, n.d.) as well as increasing their accuracy. Researchers also worked on developing customized eye trackers for tasks that do not require precise x and y coordinates as input from the tracker (Dostal, Kristensson, & Quigley, 2013; Zhang, Bulling, & Gellersen, 2013). On a different set of applications, eye trackers often help individuals to design better billboards, traffic signs, and advertising posters through analysis of users' eye-gaze patterns (Donegan et al., 2009; Duchowski, 2007; Majaranta & Raiha, 2002).…”
Section: Introduction
Confidence: 99%
“…They used head tracking to switch the pointer across screens, which was preferred by participants but in effect increased pointing time. Dostal et al. (2013) addressed similar issues by detecting which monitor the user is looking at through analyzing webcam video. The Sideways system (Zhang et al., 2013) does not need personalized calibration and can scroll the contents of a display screen by detecting eye gaze.…”
Section: Introduction
Confidence: 99%
“…Pfeuffer and Voelker explored fusing touchscreen interaction with gaze-controlled systems by using eye gaze for object selection and touch interaction for object manipulation. Pfeuffer [22] explored desktop computing tasks like image searching and map navigation, while Voelker [23] investigated multi-screen displays, which are more advanced in terms of coordinate mapping between horizontal and vertical displays compared to Dostal's [11] system. However, our proposed work uses eye gaze not only for object selection but also for cursor movement.…”
Section: Gaze Controlled Interface
Confidence: 99%