22nd International Conference on Human-Computer Interaction With Mobile Devices and Services 2020
DOI: 10.1145/3379503.3403548

Evaluating Menu Techniques for Handheld AR with a Smartphone & Mid-Air Pen

Abstract: Figure 1: Handheld AR systems with a mid-air pointing device let users interact in Augmented Reality using only a smartphone and a custom-printed pen (left). We implemented and evaluated menu techniques that use different methods for interacting with the system, for example, the mid-air pen (middle) or the hand holding the smartphone (right).

Cited by 7 publications (5 citation statements)
References 33 publications

“…Being able to track all these input modalities without any external devices by only using an off-the-shelf smartphone would enable many powerful spatial interactions, discovered over decades of in-lab research, for everyone (e.g., head or gaze pointing, virtual-hand or peephole interactions, body-centered inputs). First steps were already made for using smartphone-based world-tracking in the domain of distant displays [2], handheld AR [49][50][51], and head-mounted displays [52], as well as using face-tracking in the domain of cross-device [53] and on-phone interactions [54]. Using simultaneous world- and face-tracking on off-the-shelf smartphones, however, still remains unaddressed, since just recently the first examples of the technology were featured for handheld AR use-cases [55].…”
Section: Discussion
confidence: 99%
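The simultaneous world- and face-tracking referred to in the statement above is exposed on recent iPhones through ARKit. Below is a minimal Swift sketch, assuming iOS 13+ and a device that supports user face tracking alongside world tracking; the view-controller wiring and class name are illustrative and not taken from any of the cited works.

```swift
import ARKit
import UIKit

/// Minimal sketch: one ARKit session running world tracking with
/// user face tracking enabled at the same time (iOS 13+ devices only).
final class SimultaneousTrackingViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // World tracking drives the handheld-AR scene; enabling
        // userFaceTrackingEnabled additionally delivers ARFaceAnchor
        // updates from the front camera through the same session.
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // faceAnchor.transform is the head pose in world coordinates,
            // which could drive head- or gaze-based pointing in handheld AR.
            _ = faceAnchor.transform
        }
    }
}
```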
“…Controllers integrate buttons, trackpads, and/or joysticks. Other types of controllers exist, including pens [51]. Myopoint [18] uses the forearm, equipped with a Myo band to measure electromyographic activity and 3D orientation (using IMUs), to point on a large distant screen.…”
Section: 3D Selection
confidence: 99%
“…ARPen [7] represents a series of studies with a similar approach to HCI. The pen is designed with a cube shape, and the study is limited to six markers, with each adjacent pair at a 90-degree angle.…”
Section: Related Work
confidence: 99%
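A pen built as a cube of face markers can, in principle, be tracked by detecting whichever face marker is currently visible and applying a precalibrated rigid offset to obtain the pen-tip pose. The Swift sketch below illustrates that idea under those assumptions; the marker IDs, calibration data, and detection step are hypothetical and not the cited study's implementation.

```swift
import simd

/// Sketch of pen-tip pose recovery for a cube-shaped marker pen:
/// each face carries one marker, and a precalibrated rigid transform
/// maps the pen-tip frame into that marker's frame.
struct MarkerCubePen {
    /// markerFromTip[faceID] transforms pen-tip coordinates into the
    /// coordinate frame of the marker on that face (calibration data).
    let markerFromTip: [Int: simd_float4x4]

    /// Given the world pose of the currently visible face marker,
    /// returns the pen-tip pose in world coordinates.
    func tipPoseInWorld(visibleFaceID: Int,
                        worldFromMarker: simd_float4x4) -> simd_float4x4? {
        guard let offset = markerFromTip[visibleFaceID] else { return nil }
        // worldFromTip = worldFromMarker * markerFromTip
        return worldFromMarker * offset
    }
}
```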