2013
DOI: 10.1007/978-3-642-40480-1_18

User-Defined Gestures for Augmented Reality

Abstract: Recently there has been an increase in research toward using hand gestures for interaction in the field of Augmented Reality (AR). These works have primarily focused on researcher-designed gestures, while little is known about user preference and behavior for gestures in AR. In this paper, we present our guessability study for hand gestures in AR, in which 800 gestures were elicited for 40 selected tasks from 20 participants. Using the agreement found among gestures, a user-defined gesture set was cre…

Cited by 142 publications (155 citation statements)
References 18 publications
“…Our results were obtained by adapting the familiar methodology of Fitts' law studies along with a measurement of individuation adopted from motor control research. This is in contrast to existing work in gesture design that has considered elicitation methods to learn about user preferences, intuitiveness, and social acceptability [23,26,31].…”
Section: Discussion
confidence: 90%
“…High performance is decisive in activities like text entry, virtual reality, command selection, and gaming. However, previous work has focused on eliciting intuitive multi-finger gestures from users [23,26]. This leaves out many issues, including performance characteristics of gestures involving single and multiple fingers simultaneously.…”
Section: Introduction
confidence: 99%
“…It should support such actions as: scaling; navigating in visualized 3D space; selecting sub-spaces, objects, groups of visual elements (flow/path elements) and views; manipulating and placing; planning routes of view; generating, extracting and collecting data (based on the reviewed visualized data). A novel system should allow multimodal control by voice and/or gestures in order to make it more intuitive for users as it is shown in [188][189][190] and [191]. Nevertheless, one of the main issues regarding this direction of development is the fact that implementing effective gestural and voice interaction is not a trivial matter.…”
Section: Future Research Agenda and Data Visualization Challenges
confidence: 99%
“…Gesture recognition taxonomy was implemented referencing the work of HitLabNZ, University of Christchurch, NZ. Their study 'User-defined gestures for augmented reality' [5] records extensive blind testing of various hand poses or 'tasks' that might be implemented within an AR related interface. Using this information three distinct tasks were implemented in the Xuni system animation shown in Table 1.…”
Section: Visualization
confidence: 99%