Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology 2014
DOI: 10.1145/2642918.2647373
In-air gestures around unmodified mobile devices

Abstract: Figure 1: Touch input is expressive but can occlude large parts of the screen (A). We propose a machine-learning-based algorithm for gesture recognition that expands the interaction space around the mobile device (B), adding in-air gestures and hand-part tracking (D) to commodity off-the-shelf mobile devices, relying only on the device's camera and requiring no hardware modifications. We demonstrate a number of compelling interactive scenarios, including bi-manual input to mapping and gaming applications (C+D). The algori…

Cited by 149 publications (57 citation statements); references 44 publications.
“…GestureWatch [21] and HoverFlow [22] are both systems that recognize in-air hand gestures performed over mobile or wearable devices relying on infrared proximity sensors. Hand gestures can also be recognized from a single RGB camera found on most mobile devices [40]. WatchMe [41] is a camera-based system that can track a pen or a laser pointer on a drawing canvas and use this as an input modality.…”
Section: Gestural Interaction With Wearables
confidence: 99%
“…All these approaches require the user to wear specialized sensor electronics and might not be feasible in real-world scenarios. Song et al [13] recently introduced a data-driven gesture recognition approach that enables mid-air interaction on unmodified portable devices. However, the types of interaction are limited to 2D gestures.…”
Section: Related Work
confidence: 99%
“…Our method is similar to those proposed in [2,13]. In [13] several randomized decision forests (RFs) are combined to enable the recognition of discrete 2D hand shapes or gestures.…”
Section: Spatial Interaction For Wearable Computing
confidence: 99%
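The excerpt above notes that [13] combines several randomized decision forests (RFs) to recognize discrete 2D hand shapes. As a rough illustration of the ensemble-voting idea only — not the authors' pipeline — the sketch below trains randomized decision stumps on made-up per-frame hand features and lets them vote on a shape class. The feature names, data, and labels are all invented for the example.

```python
import random
from collections import Counter

def train_stump(samples):
    """Fit a depth-1 tree on a randomly chosen feature and threshold."""
    feat = random.randrange(len(samples[0][0]))
    thresh = random.choice([x[feat] for x, _ in samples])
    fallback = Counter(y for _, y in samples).most_common(1)[0][0]
    def majority(ys):
        return Counter(ys).most_common(1)[0][0] if ys else fallback
    left = majority([y for x, y in samples if x[feat] <= thresh])
    right = majority([y for x, y in samples if x[feat] > thresh])
    return feat, thresh, left, right

def predict_stump(stump, x):
    feat, thresh, left, right = stump
    return left if x[feat] <= thresh else right

def train_forest(samples, n_trees=50, seed=0):
    """Train an ensemble of randomized stumps on bootstrap resamples."""
    random.seed(seed)
    forest = []
    for _ in range(n_trees):
        bootstrap = [random.choice(samples) for _ in samples]
        forest.append(train_stump(bootstrap))
    return forest

def predict_forest(forest, x):
    """Majority vote over all trees gives the predicted hand shape."""
    votes = Counter(predict_stump(s, x) for s in forest)
    return votes.most_common(1)[0][0]

# Invented per-frame features: (normalized hand area, bounding-box aspect ratio)
train = [((0.20, 0.50), "fist"), ((0.25, 0.55), "fist"), ((0.22, 0.48), "fist"),
         ((0.80, 1.60), "open"), ((0.75, 1.50), "open"), ((0.82, 1.55), "open")]
forest = train_forest(train)
print(predict_forest(forest, (0.21, 0.49)))  # classify a new frame
```

A real system would of course use deeper trees over image-derived features and many more training frames; the point here is only that many weak, randomized classifiers combined by voting can yield a robust discrete-shape decision per frame.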