Fig. 1. Snapshots of our online marker-based hand tracking system on sequences with two-handed and hand-object interactions. We demonstrate a novel marker-labeling and tracking system that enables fully automatic, real-time estimation of hand poses in challenging interaction scenarios with frequent occlusions. Markers labeled as left hand and right hand are rendered as orange and blue spheres, respectively, while markers associated with predefined rigid bodies are rendered as green spheres.

Optical marker-based motion capture is the dominant technique for obtaining high-fidelity human body animation for special effects, movies, and video games. However, motion capture has seen limited application to the human hand due to the difficulty of automatically identifying (or labeling) identical markers on self-similar fingers. We propose a technique that frames the labeling problem as a keypoint regression problem, conducive to a solution using convolutional neural networks. We demonstrate the robustness of our labeling solution to occlusion, ghost markers, hand shape, and even motions involving two hands or handheld objects. Our technique is equally applicable to sparse or dense marker sets and can run in real time to support interaction prototyping with high-fidelity hand tracking and hand presence in virtual reality.
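To make the keypoint-regression framing concrete, below is a minimal sketch (not the authors' network): unlabeled markers are rasterized into a single-channel occupancy image, and a small convolutional network regresses the image location of each named keypoint. The class name MarkerLabelNet, the image size, and all layer widths are illustrative assumptions.

    # Minimal sketch of marker labeling as CNN keypoint regression.
    # All names and sizes here are assumptions for illustration.
    import torch
    import torch.nn as nn

    class MarkerLabelNet(nn.Module):
        def __init__(self, num_keypoints: int = 19, img_size: int = 64):
            super().__init__()
            self.k = num_keypoints
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            )
            # Regress an (x, y) image coordinate for each named keypoint.
            self.head = nn.Linear(64 * (img_size // 8) ** 2, num_keypoints * 2)

        def forward(self, occupancy: torch.Tensor) -> torch.Tensor:
            # occupancy: (B, 1, 64, 64) rasterization of unlabeled markers
            x = self.features(occupancy)                  # (B, 64, 8, 8)
            return self.head(x.flatten(1)).view(-1, self.k, 2)

Under this framing, labels could then be assigned by nearest-neighbor matching between the regressed keypoint positions and the observed marker positions; unmatched markers are simply left unlabeled, which is one plausible way such a scheme tolerates occluded and ghost markers.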
We investigate multi-stroke marking menus for multitouch devices and show that using two hands can improve performance. We present two new two-handed multi-stroke marking menu variants in which users either draw strokes with both hands simultaneously or alternate strokes between hands. In a pair of studies we find that using two hands simultaneously is 10-15% faster than using a single, dominant-handed marking menu. Alternating strokes between hands doubles the number of accessible menu items for the same number of strokes, and performs similarly to a one-handed marking menu. We also examine how stroke direction affects performance. When using thumbs on an iPod Touch, drawing strokes upwards and inwards is faster than other directions. For two-handed simultaneous menus, stroke pairs that are bilaterally symmetric or share the same direction are fastest. We conclude with design guidelines and sample applications to aid multitouch application developers interested in using one- and two-handed marking menus.
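One way to see the doubling claim: if each stroke selects among d directions and strokes strictly alternate hands, then only the starting hand is a free choice, and that single left/right bit doubles the vocabulary. The short sketch below works through the count; the function names and the parameter values are our framing, not code from the paper.

    # Back-of-the-envelope count behind the "doubles the number of
    # accessible menu items" claim (illustrative assumptions only).
    def one_handed_items(d: int, s: int) -> int:
        return d ** s  # each of s strokes picks one of d directions

    def alternating_items(d: int, s: int) -> int:
        # Strokes alternate hands, so only the starting hand is free:
        # that single left/right choice doubles the vocabulary.
        return 2 * d ** s

    print(one_handed_items(8, 2))   # 64 items with one hand
    print(alternating_items(8, 2))  # 128 items when alternating hands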
Proton++ is a declarative multitouch framework that allows developers to describe multitouch gestures as regular expressions of touch event symbols. It builds on the Proton framework by allowing developers to incorporate custom touch attributes directly into the gesture description. These custom attributes increase the expressivity of the gestures while preserving the benefits of Proton: automatic gesture matching, static analysis for conflict detection, and graphical gesture creation. We demonstrate Proton++'s flexibility with several examples: a direction attribute for describing trajectory, a pinch attribute for detecting when touches move towards one another, a touch area attribute for simulating pressure, an orientation attribute for selecting menu items, and a screen location attribute for simulating hand ID. We also use screen location to simulate user ID and enable simultaneous recognition of gestures by multiple users. In addition, we show how to incorporate timing into Proton++ gestures by reporting touch events at a regular time interval. Finally, we present a user study that suggests users are roughly four times faster at interpreting gestures written using Proton++ than those written in the procedural event-handling code commonly used today.
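The core idea, gestures as regular expressions over touch event symbols carrying custom attributes, can be illustrated with plain Python regular expressions. In the sketch below, each event becomes a symbol (D = down, M = move, U = up) tagged with a touch ID and a direction attribute; the symbol encoding and helper names are our assumptions for illustration and are not Proton++'s actual API, which additionally provides automatic matching and static conflict analysis.

    # Illustrative sketch of the Proton++ idea: a gesture is a regular
    # expression over a stream of attributed touch event symbols.
    # Encoding and names are assumptions, not the real Proton++ API.
    import re

    def symbol(action: str, touch_id: int, direction: str = "") -> str:
        # e.g., ("M", 1, "E") -> "M1:E " ; attributes ride along per symbol
        return f"{action}{touch_id}:{direction} " if direction else f"{action}{touch_id} "

    # A rightward one-finger swipe: touch 1 goes down, moves east one
    # or more times, then lifts.
    SWIPE_EAST = re.compile(r"D1 (M1:E )+U1 ")

    stream = symbol("D", 1) + symbol("M", 1, "E") + symbol("M", 1, "E") + symbol("U", 1)
    print(bool(SWIPE_EAST.fullmatch(stream)))  # True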