Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology 2019
DOI: 10.1145/3332165.3347867
Opisthenar

Abstract: We introduce a vision-based technique to recognize static hand poses and dynamic finger tapping gestures. Our approach employs a camera on the wrist, with a view of the opisthenar (back of the hand) area. We envisage such cameras being included in a wrist-worn device such as a smartwatch, fitness tracker or wristband. Indeed, selected off-the-shelf smartwatches now incorporate a built-in camera on the side for photography purposes. However, in this configuration, the fingers are occluded from the view of the c…
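The paper does not publish its recognition pipeline here, but the idea of classifying hand poses from back-of-hand imagery can be illustrated with a toy sketch. The example below is an assumption-laden simplification, not the authors' method: it stands in for their recognizer with a nearest-centroid classifier over flattened grayscale crops, and the two synthetic "poses" are distinguished only by mean brightness.

```python
# Toy sketch of pose classification from back-of-hand (opisthenar) crops.
# NOT the paper's implementation: it assumes precomputed grayscale crops
# and uses a nearest-centroid classifier purely for illustration.
import numpy as np

def train_centroids(images, labels):
    """Average the flattened training images for each pose label."""
    centroids = {}
    for label in set(labels):
        samples = [img.ravel() for img, l in zip(images, labels) if l == label]
        centroids[label] = np.mean(samples, axis=0)
    return centroids

def classify(image, centroids):
    """Return the pose label whose centroid is nearest in pixel space."""
    vec = image.ravel()
    return min(centroids, key=lambda lbl: np.linalg.norm(vec - centroids[lbl]))

# Two synthetic "poses": dark crops ("fist") vs. bright crops ("open").
rng = np.random.default_rng(0)
fist = [rng.normal(0.2, 0.05, (8, 8)) for _ in range(5)]
open_hand = [rng.normal(0.8, 0.05, (8, 8)) for _ in range(5)]
centroids = train_centroids(fist + open_hand, ["fist"] * 5 + ["open"] * 5)
print(classify(rng.normal(0.8, 0.05, (8, 8)), centroids))  # → open
```

A real system would replace the centroid lookup with a learned model and add a temporal component for the dynamic finger-tapping gestures, but the interface — image in, pose label out — is the same.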

Cited by 37 publications (6 citation statements)
References 26 publications
“…Dove et al. [209] conducted a survey to understand how design innovation is practiced in the ML domain in terms of user experience.…”
Section: Survey Results
confidence: 99%
“…Sterman et al. [129] developed a tool to visualize and model writing styles using Deep Learning. Opisthenar [130] is a Deep Learning based tool to recognize hand poses and finger tapping gestures. Zhang et al. [131] propose several techniques to correct errors more efficiently when typing on mobile phone keyboards.…”
Section: Applications
confidence: 99%
“…Based on their ability to capture information over a large area, vision sensors in the form of wearable devices have become a common means of achieving hand-based interactions. Such sensors have mainly been attached to locations around the hand, such as the fingers [8, 40] and wrists [23, 25, 41, 42, 49, 56, 60], to obtain a desirable field of view of the hand shape without interfering with hand movement. Nevertheless, these solutions are specifically designed for detecting hand posture and are not applicable to detecting hand location.…”
Section: Hand Posture Recognition
confidence: 99%
“…For detecting hand posture, vision-based wearable devices have become the focus of many works due to recent advancements in computer vision. For example, Opisthenar is a wrist-worn camera used to classify the hand's posture by seeing the back of the hand [60]. CyclopsRing is a ring-style wearable fisheye camera device that can distinguish the hand postures by analyzing the unique view caused by different postures [8].…”
Section: Introduction
confidence: 99%