Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3313831.3376592
Understanding Viewport- and World-based Pointing with Everyday Smart Devices in Immersive Augmented Reality

Cited by 15 publications (7 citation statements) · References 38 publications
“…Gaze in turn is free to point anywhere within an HMD display but not able to point beyond. Prior work on AR pointing found techniques that are reliant on a cursor that moves with the viewport to be less performant than techniques that are decoupled from the viewport [10]. This contrasts with the finding of Cao et al where Fig.…”
Section: Pointing At Dynamically Revealed Targets (mentioning, confidence: 66%)
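To make the viewport-coupled vs. decoupled distinction in the excerpt above concrete, here is a minimal sketch, not from the cited paper; the quaternion helper, the -Z-forward convention, and the function names are assumptions.

```python
import numpy as np

def rotate(quat, vec):
    """Rotate a 3D vector by a unit quaternion given as (w, x, y, z)."""
    w, x, y, z = quat
    q_vec = np.array([x, y, z])
    t = 2.0 * np.cross(q_vec, vec)
    return vec + w * t + np.cross(q_vec, t)

def viewport_coupled_cursor(head_pos, head_rot, depth=2.0):
    """Cursor re-projected along the head's forward axis every frame:
    it sits at the centre of the field of view and moves with the viewport."""
    forward = rotate(head_rot, np.array([0.0, 0.0, -1.0]))  # -Z forward (assumed)
    return head_pos + depth * forward

def world_anchored_cursor(cursor_world_pos, head_pos, head_rot):
    """Decoupled cursor: it keeps its own world position; head motion only
    changes where (or whether) it appears inside the viewport."""
    conj = np.array([head_rot[0], -head_rot[1], -head_rot[2], -head_rot[3]])
    return rotate(conj, cursor_world_pos - head_pos)  # position in view space
```

The first variant re-centres the cursor whenever the head turns, which is the coupling the excerpt describes as less performant; the second leaves the cursor in place and only re-expresses it in view coordinates for rendering.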
“…Concerning XR spaces, an additional criterion comes into play: the interaction technique should scale between HMDs and HHDs, i.e., it should remain intuitive to users even when they switch devices. However, previous research has mostly focused individually on the various interaction modalities offered by XR technologies, including in-air [45-54], touch-based [55-59], tangible [60-62], head-, gaze-, or speech-based [45, 55, 58, 63-65], and multimodal [55, 58, 64] input techniques to select and manipulate virtual objects as well as to navigate in space. Many of these approaches require tracking parts of the users' bodies or external interaction devices.…”
Section: Intuitive Interaction Techniques (mentioning, confidence: 99%)
“…In these cases, inertial tracking is considered beneficial. Previous work (e.g., [56,58]) tracked the position and orientation of external input devices in space using device-incorporated IMU sensors (inertial measurement units that usually consist of accelerometers, gyroscopes, and magnetometers). Such approaches do not limit interaction to a particular region of space.…”
Section: Intuitive Interaction Techniques (mentioning, confidence: 99%)
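As a rough illustration of how device-incorporated IMU readings can be turned into an orientation estimate for such input devices, the sketch below uses a simple complementary filter, a common baseline for fusing gyroscope and accelerometer data. It is not from the cited work; the axis conventions and blending weight are assumptions, and yaw (heading) would additionally require the magnetometer.

```python
import math

ALPHA = 0.98  # assumed weight favouring the integrated gyroscope estimate

def complementary_filter(pitch, roll, gyro, accel, dt):
    """Fuse gyroscope rates (rad/s) and accelerometer readings (m/s^2) into
    smoothed pitch/roll angles (rad). Gyro integration is accurate short-term;
    the accelerometer's gravity vector corrects its long-term drift."""
    gx, gy, gz = gyro          # axis conventions vary by device
    ax, ay, az = accel

    # Short-term estimate: integrate angular rates.
    pitch_gyro = pitch + gy * dt
    roll_gyro = roll + gx * dt

    # Long-term estimate: tilt relative to gravity.
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)

    # Blend the two estimates.
    pitch = ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_acc
    roll = ALPHA * roll_gyro + (1.0 - ALPHA) * roll_acc
    return pitch, roll
```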
“…the target). Precise Head Point in most HMDs is viewport-based [8]: a cursor is placed in the center of the FoV and remains in the center when users move their head.…”
Section: Precise Head Point (PHP) (mentioning, confidence: 99%)
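A viewport-based head cursor of this kind amounts to casting a ray from the head pose along its forward axis and testing it against targets. Below is a minimal ray-sphere sketch with assumed geometry and names, not the authors' implementation.

```python
import numpy as np

def head_cursor_ray(head_pos, head_forward):
    """Viewport-based cursor: origin and direction follow the head pose,
    so the cursor always sits at the centre of the field of view."""
    return head_pos, head_forward / np.linalg.norm(head_forward)

def hits_spherical_target(origin, direction, target_center, target_radius):
    """True if the head ray passes within target_radius of target_center."""
    to_target = target_center - origin
    along = np.dot(to_target, direction)     # projection onto the ray
    if along < 0.0:
        return False                         # target is behind the user
    closest = origin + along * direction     # closest point on the ray
    return np.linalg.norm(target_center - closest) <= target_radius
```

Because the ray is tied to the head pose, reorienting the head is the only way to move the cursor, which is what makes this technique viewport-based in the sense described in the excerpt.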