Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI 2017)
DOI: 10.1145/3025453.3026005

WatchSense

Abstract: Figure 1. (a) WatchSense enables on- and above-skin input on the back of the hand (BOH) through a wrist-worn depth sensor. (b) Our prototype mimics a smartwatch setup by attaching a small depth camera to the forearm. (c) It tracks the 3D position of fingertips, their identities, and touch on the BOH in real time on consumer mobile devices. This enables a combination of mid-air and multitouch input for interactive applications on the move.
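As a rough illustration of the input model described in the caption, the sketch below shows one way the tracker's per-frame output (fingertip identity, 3D position, touch state) could be routed to multitouch versus mid-air handlers. This is a minimal sketch under assumed names: FingertipEvent, InputRouter, and the handler callbacks are hypothetical and are not part of the WatchSense implementation.

```python
# Hypothetical sketch of routing combined touch + mid-air fingertip input.
# All names and types here are illustrative assumptions, not the WatchSense API.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class FingertipEvent:
    finger_id: str                         # tracked identity, e.g. "index" or "middle"
    position: Tuple[float, float, float]   # 3D position relative to the sensor (metres)
    touching: bool                         # True if the fingertip touches the back of the hand

class InputRouter:
    """Routes per-frame fingertip events to multitouch or mid-air handlers."""

    def __init__(self,
                 on_touch: Callable[[FingertipEvent], None],
                 on_midair: Callable[[FingertipEvent], None]):
        self.on_touch = on_touch
        self.on_midair = on_midair

    def process_frame(self, events: Dict[str, FingertipEvent]) -> None:
        for event in events.values():
            if event.touching:
                self.on_touch(event)    # 2D multitouch input on the back of the hand
            else:
                self.on_midair(event)   # 3D mid-air input above the skin
```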

Cited by 63 publications (15 citation statements) | References 42 publications
“…For instance, it has been demonstrated that skin-based input [29] (a tap on the skin) evokes a higher SoA in users compared with typical keyboard-based input [17]. This finding may support applications in skin-interaction smartwatches [70,79]. On the other hand, speech input has been suggested to diminish SoA [45], which can provide a major benefit in interface design.…”
Section: Application of Agency Measures in HCI and VR
confidence: 99%
“…Digits [11] requires an IR laser line projector, whereas DigiTap [18] requires an LED flash synced with an accelerometer to detect the vibrations occurring during finger taps. In WatchSense [24], the authors created a compact wearable prototype, attached to a user's forearm, to detect finger interaction from the other hand. Closest to our work, Chen et al. [3] use an elevated camera on the outer side of the wrist to track 10 ASL hand poses.…”
Section: Vision-Based Approaches on Wearable Devices
confidence: 99%
“…Namely, when the hand intersects with another hand or object, there is a range of interactions including finger movement, touch, pinching, or grasping. Consider, for example, hand-to-hand interaction, where the opisthenar area can act as a touchpad operated by the fingers of a second hand [24]. Such interactions can be extended to explore finger-to-finger interactions (such as clasping), pinching, or natural two-handed grasping actions.…”
Section: Future Work
confidence: 99%
“…Depth cameras [4,12,14] have usually been employed to create such surfaces, while other work has combined different sensor sources [13]. However, touch classification, which is critical to the quality of the interaction, remains challenging [14] and is traditionally addressed by hand-tuning parameters and thresholding.…”
Section: Introduction
confidence: 99%
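The hand-tuned thresholding baseline mentioned in the last excerpt can be illustrated with a short sketch. The Python snippet below is an assumption-laden illustration, not code from WatchSense or any of the cited systems: the function names, the hysteresis scheme, and the millimetre thresholds are hypothetical choices that show how a fingertip can be classified as touching when its depth lies within a fixed margin of a pre-captured background surface depth.

```python
# Hedged sketch of threshold-based touch classification from depth data.
# Parameter names and values are illustrative assumptions.
import numpy as np

TOUCH_ON_MM = 10.0    # enter "touch" when the fingertip is closer than this to the surface
TOUCH_OFF_MM = 15.0   # leave "touch" only beyond this gap (hysteresis reduces flicker)

def classify_touch(fingertip_depth_mm: float,
                   surface_depth_mm: float,
                   was_touching: bool) -> bool:
    """Threshold-based touch test for a single tracked fingertip."""
    gap = surface_depth_mm - fingertip_depth_mm   # hover distance above the surface
    if was_touching:
        return gap < TOUCH_OFF_MM
    return gap < TOUCH_ON_MM

def surface_depth_at(background_depth: np.ndarray, u: int, v: int, win: int = 3) -> float:
    """Median depth of the surface around pixel (u, v) in a pre-captured background map."""
    patch = background_depth[max(v - win, 0): v + win + 1,
                             max(u - win, 0): u + win + 1]
    return float(np.median(patch))
```

The hysteresis pair (TOUCH_ON_MM smaller than TOUCH_OFF_MM) is one common way such hand-tuned parameters are set so that a fingertip hovering near the decision boundary does not rapidly toggle between touch and hover states.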