Proceedings of the 3rd ACM Symposium on Spatial User Interaction 2015
DOI: 10.1145/2788940.2788949
Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input

Abstract: Interactive workspaces combine horizontal and vertical touch surfaces into a single digital workspace. An exploration of these systems showed that direct interaction on the vertical surface is cumbersome and less accurate than on the horizontal one. To overcome these problems, indirect touch systems turn the horizontal touch surface into an input device that allows manipulation of objects on the vertical display. If the horizontal touch surface also acts as a display, however, it becomes neces…

Cited by 24 publications (14 citation statements) · References 29 publications
“…Pfeuffer and Voelker explored fusing touchscreen interaction with gaze-controlled systems by using eye gaze for object selection and touch interaction for object manipulation. Pfeuffer [22] explored desktop computing tasks like image searching and map navigation, while Voelker [23] investigated a multi-screen display, which is more advanced in terms of coordinate mapping between horizontal and vertical displays compared to Dostal's [11] system. However, our proposed work uses eye gaze not only for object selection but also for cursor movement.…”
Section: Gaze Controlled Interface
Confidence: 99%
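The division of labor described in this statement (gaze selects the target, touch manipulates it) can be sketched as a small event handler. This is an illustrative sketch only, not the authors' implementation; the class and method names (`GazeTouchWorkspace`, `on_gaze`, `on_touch_drag`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Obj:
    """An axis-aligned rectangular object on the display."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class GazeTouchWorkspace:
    """Hypothetical sketch: gaze performs selection, touch performs manipulation."""

    def __init__(self, objects):
        self.objects = objects
        self.selected = None

    def on_gaze(self, gx: float, gy: float) -> None:
        # Gaze fixation selects whichever object it lands on.
        for obj in self.objects:
            if obj.contains(gx, gy):
                self.selected = obj
                return

    def on_touch_drag(self, dx: float, dy: float) -> None:
        # Touch drags translate the gaze-selected object; touch never re-selects.
        if self.selected is not None:
            self.selected.x += dx
            self.selected.y += dy
```

Keeping selection on the gaze channel and manipulation on the touch channel means a drag anywhere on the input surface moves the fixated object, which is the core of the fusion the citing paper describes.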
“…Based on this, Voelker et al [30] compared different interaction techniques to enable a comfortable tracking state for indirect touch input in such setups and found lift-and-tap to be the most promising. In the context of interactive workspace ergonomics (e.g., [31]), both Voelker et al [29] and Pfeuffer et al [24] introduced gaze-based mode switching between direct and indirect touch input. Further, Gilliot et al [8] explored the influence of input surface form factors on indirect target selection tasks with an absolute mapping and found that decreasing the input surface size improves target selection accuracy and that diverging aspect ratios between input and display areas decrease it.…”
Section: Indirect Touch Input
Confidence: 99%
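The absolute mapping that Gilliot et al studied scales a touch point on the input surface proportionally onto the display. A minimal sketch, with a hypothetical function name; when the input and display aspect ratios diverge, the two axes stretch unevenly, which is one way the accuracy loss noted above can arise.

```python
def map_absolute(tx: float, ty: float,
                 input_size: tuple[float, float],
                 display_size: tuple[float, float]) -> tuple[float, float]:
    """Map a touch point (tx, ty) on the horizontal input surface onto the
    vertical display via an absolute (proportional) mapping. Sketch only."""
    iw, ih = input_size
    dw, dh = display_size
    return tx / iw * dw, ty / ih * dh

# A touch at the center of a 320x180 input surface lands at the center of
# a 1920x1080 display: map_absolute(160, 90, (320, 180), (1920, 1080))
# → (960.0, 540.0)
```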