Proceedings of the 1st Conference on Novel Gaze-Controlled Applications 2011
DOI: 10.1145/1983302.1983303
Designing gaze-supported multimodal interactions for the exploration of large image collections

Cited by 51 publications (45 citation statements)
References 17 publications
“…Stellmach et al evaluated techniques in several works that combine gaze and mobile input, i.e., inertial sensing and touch [22,20]. Their work developed techniques to navigate large image collections on public displays.…”
Section: Multi-modal Gaze Interaction With Public Displays (mentioning, confidence: 99%)
“…Their work developed techniques to navigate large image collections on public displays. Their techniques used gaze for pointing, while touch and accelerometer input were used to pan and zoom through images [22]. Users perceived increased effort and complexity when panning and zooming.…”
Section: Multi-modal Gaze Interaction With Public Displays (mentioning, confidence: 99%)
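The division of labor described above (gaze fixes the point of interest; touch or tilt drives the zoom) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `Viewport` type, the gain constant, and the `tilt_delta` input are all hypothetical.

```python
# Hedged sketch: gaze anchors the zoom target while a touch/tilt delta
# supplies the zoom magnitude, in the spirit of the techniques in [22].
from dataclasses import dataclass


@dataclass
class Viewport:
    cx: float     # center x of the visible region (image coordinates)
    cy: float     # center y
    scale: float  # 1.0 = whole image visible; larger = zoomed in


def zoom_at_gaze(vp: Viewport, gaze_x: float, gaze_y: float,
                 tilt_delta: float) -> Viewport:
    """Zoom in or out around the gazed-at point.

    tilt_delta > 0 zooms in. The gaze point stays (approximately)
    stationary on screen by shifting the viewport center toward it
    in proportion to the scale change.
    """
    factor = 1.0 + 0.1 * tilt_delta          # hypothetical gain
    new_scale = max(1.0, vp.scale * factor)  # never zoom out past full view
    # Fraction by which the center must move toward the gaze point
    # so that the gazed-at pixel does not jump during the zoom.
    t = 1.0 - vp.scale / new_scale
    return Viewport(cx=vp.cx + (gaze_x - vp.cx) * t,
                    cy=vp.cy + (gaze_y - vp.cy) * t,
                    scale=new_scale)
```

Keeping the gaze point stationary is the key design choice: the eyes select *where* to zoom, so the hand-held device only needs to supply a one-dimensional rate, which is what makes the combination usable on a distant public display.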
“…The Midas touch problem arises when users' gaze is used in human-computer interaction. Stellmach et al. [25] aim to overcome the Midas touch problem by coupling gaze-controlled interaction with a touch-and-tilt device that indicates the desired action. This approach could be utilized in an ambient interface with gestures.…”
Section: Results (mentioning, confidence: 99%)
“…Stellmach et al investigated combinations of gaze with a wide variety of modalities [10], including a keyboard [23], tilt gestures [23,24], a mouse wheel [24], touch gestures [24,25,26] and foot pedals [27]. A common interaction paradigm in gaze-based interaction is that of gaze-supported interaction: gaze suggests and the other modality confirms [25].…”
Section: Gaze In Multimodal Interactions (mentioning, confidence: 99%)
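The "gaze suggests, the other modality confirms" paradigm cited above can be sketched in a few lines. This is a schematic illustration, not code from the cited works; the class and method names are invented for the example.

```python
# Hedged sketch of gaze-supported selection: gaze alone only highlights a
# candidate target; a separate touch event commits the selection. Because
# looking at something never triggers an action by itself, the Midas touch
# problem is avoided.
from typing import Optional


class GazeSupportedSelector:
    def __init__(self) -> None:
        self.highlighted: Optional[str] = None  # target currently gazed at

    def on_gaze(self, target: Optional[str]) -> None:
        """Gaze suggests: update the highlight, never select."""
        self.highlighted = target

    def on_touch_confirm(self) -> Optional[str]:
        """Touch confirms: select whatever is currently gazed at."""
        return self.highlighted
```

The separation of suggestion from confirmation is what distinguishes this paradigm from dwell-time selection, where simply holding one's gaze long enough triggers the action.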