2021
DOI: 10.1016/j.cag.2021.01.001
ARtention: A design space for gaze-adaptive user interfaces in augmented reality

Cited by 66 publications (16 citation statements)
References 39 publications
“…Gaze corresponds to the user's focus of attention which makes it natural and fast for pointing at objects of interest [4]. A wide range of work has harnessed gaze for implicit interaction, for example, to render interfaces attentive to the user [49] and adapt information displays [29]. Gaze has also been adopted for explicit input, based on interaction techniques that extend gaze pointing with a selection method equivalent to a mouse "click".…”
Section: Pointing In Gaze and Mid-air Interfaces (mentioning, confidence: 99%)
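The statement above describes extending gaze pointing with a selection method equivalent to a mouse "click". A common technique for this is dwell-time selection, where a fixation held on the same target for a threshold duration triggers the click. The following is a minimal sketch of that idea; the class and parameter names are illustrative assumptions, not from the cited work.

```python
import time


class DwellSelector:
    """Hypothetical sketch: turn gaze pointing into a 'click' via dwell time.

    A selection fires once the gaze has rested on the same target for at
    least `dwell_threshold` seconds.
    """

    def __init__(self, dwell_threshold=0.8):
        self.dwell_threshold = dwell_threshold  # seconds of steady gaze
        self.current_target = None
        self.dwell_start = None

    def update(self, target, now=None):
        """Feed the currently gazed-at target; return it once selected."""
        now = time.monotonic() if now is None else now
        if target != self.current_target:
            # Gaze moved to a new target (or away): restart the dwell timer.
            self.current_target = target
            self.dwell_start = now
            return None
        if target is not None and now - self.dwell_start >= self.dwell_threshold:
            # Dwell threshold reached: emit the selection and re-arm the
            # timer so a sustained fixation does not fire repeatedly.
            self.dwell_start = now
            return target
        return None
```

In use, an eye-tracker loop would call `update()` each frame with the object currently under the gaze ray; the dwell threshold trades off selection speed against accidental "Midas touch" activations.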
“…This app introduces the concept of design space for gaze-input-based interfaces for AR environments [10]. It is a form of input that interacts with the world based on where the user is looking.…”
Section: ARtention (mentioning, confidence: 99%)
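The statement above characterizes a design space for gaze-input AR interfaces that react to where the user is looking. One simple implicit-interaction pattern in that spirit is to scale the detail of an AR label with accumulated gaze attention, moving from hidden to glanceable to fully expanded. This is a hedged illustration under assumed thresholds, not the design space's actual mechanism.

```python
def detail_level(attention_seconds):
    """Illustrative mapping from accumulated gaze dwell on an AR object
    to a coarse information-detail level. Thresholds are assumptions.
    """
    if attention_seconds < 0.3:
        return "hidden"   # object not (yet) attended to: show nothing
    if attention_seconds < 1.5:
        return "summary"  # brief glance: show a short, glanceable label
    return "full"         # sustained attention: expand full details
```

An AR renderer could accumulate per-object gaze time each frame and pick the label layout from this mapping, decaying the counter when the gaze leaves the object.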
“…Pham and Stuerzlinger propose comparing the different controllers, focusing on the flexibility in using the 3D pen for VR and AR applications [105]. Electronic skins have recently been investigated as one of the most promising device solutions for future VR/AR devices [106], while the ARtention project [107] illustrates how input information can come directly from retinal motion. Finally, the MARVIS project proposes integrating HMDs and mobile devices for optimized data management in augmented reality [108].…”
Section: Network Connectivity (mentioning, confidence: 99%)
“…They do not detect user input but define an active perception of the system towards specific signals appropriately encoded and read by the system. Standard sensors include motion detection [135], gaze tracking [107], and speech recognition for visual content [136]. Most of these sensors have little presence in the AR world.…”
Section: Sensor-based Interfaces (mentioning, confidence: 99%)