2021
DOI: 10.48550/arxiv.2112.05986
Preprint
Acoustic Sensing-based Hand Gesture Detection for Wearable Device Interaction

Abstract: Hand gesture recognition attracts great attention for interaction since it is intuitive and natural to perform. In this paper, we explore a novel method for interaction by using bone-conducted sound generated by finger movements while performing gestures. We design a set of gestures that generate unique sound features, and capture the resulting sound from the wrist using a commodity microphone. Next, we design a sound event detector and a recognition model to classify the gestures. Our system achieves an overal…
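The abstract describes a pipeline of capturing wrist-borne sound, detecting sound events, and then classifying them. The paper's actual detector is not specified in this excerpt; the following is a minimal sketch of one common approach, a short-time-energy threshold detector, with hypothetical parameter values (frame size, hop size, threshold) chosen for illustration only:

```python
import numpy as np

def detect_sound_events(signal, sr, frame_ms=32, hop_ms=16, threshold_db=-30.0):
    """Flag frames whose short-time energy exceeds a threshold relative to
    the loudest frame. Parameters here are illustrative, not the paper's."""
    frame = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    energies = []
    for start in range(0, len(signal) - frame + 1, hop):
        chunk = signal[start:start + frame]
        energies.append(np.mean(chunk ** 2))  # mean-square energy per frame
    energies = np.array(energies)
    ref = energies.max() + 1e-12              # avoid division by zero
    db = 10.0 * np.log10(energies / ref + 1e-12)
    return db > threshold_db                  # boolean mask of active frames

# Synthetic example: 1 s of silence with a short burst mimicking a finger tap.
sr = 16000
sig = np.zeros(sr)
sig[8000:8480] = 0.5 * np.sin(2 * np.pi * 440 * np.arange(480) / sr)
mask = detect_sound_events(sig, sr)
```

In a full system, frames flagged by such a detector would be segmented and passed to the recognition model; the burst in the synthetic signal is flagged while the surrounding silence is not.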

Cited by 1 publication (2 citation statements)
References 8 publications
“…For instance, Chirp, [36] a startup company, has leveraged micro-electromechanical systems (MEMS) and the time-of-flight ultrasound sensor for touch-free HGR. Recently, IBM [23] has developed ultrasound-based microphones to detect finger bone-conducted sound generation during hand gestures. The ultrasound array has also shown high recognition accuracy in deciphering distinctive gestures and types of digit flexion using traditional imaging modalities.…”
Section: Introduction
Confidence: 99%
“…[3,13–15] As a primary mode to understand and mimic human-human interaction since the beginning of human cognition, hand gesture recognition (HGR) has been the holy grail of HRI because of its efficient communication in rugged environments that are difficult for verbal or facial expressions (e.g., construction sites and emergency rescues). [3,16–23] Conventional methods for HGR typically rely on visual or infrared cameras, [24] electromyography (EMG) measurements, [24,25] or stretchable strain sensors [26] with resource-intensive machine learning algorithms to decipher the gestures. However, the accuracy of HGR has been hindered by limited visual image quality due to environmental interference, poor contact impedance, and low-quality data due to cross-talk in EMG and strain sensors.…”
Section: Introduction
Confidence: 99%