Companion Publication of the 2021 International Conference on Multimodal Interaction
DOI: 10.1145/3461615.3485428
Knock&Tap: Classification and Localization of Knock and Tap Gestures using Deep Sound Transfer Learning

Cited by 3 publications (2 citation statements)
References 16 publications
“…The localisation error of their proposal was 10.2cm. More recently, in 2021, Jeong et al [44] presented 'Knock&Tap', an audio-based approach capable of performing gesture classification and gesture localisation through deep transfer learning. The proposal comprises a single 4-microphone array to record the sound of the user's knocking and tapping gestures on a wood/glass panel.…”
Section: Tactile Perception Based On Acoustic Sensing
Confidence: 99%
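The statement above summarises the approach at a high level: a single 4-microphone array records knock/tap sounds on a panel, and a deep network trained via transfer learning classifies the gesture. As a rough, hypothetical illustration (not the authors' published pipeline), the sketch below fine-tunes an ImageNet-pretrained ResNet-18 on mel-spectrograms of the 4-channel recordings; the sample rate, spectrogram settings, and two-class head are assumptions made for this example.

```python
# Hypothetical sketch: transfer learning on knock/tap spectrograms.
# Sample rate, spectrogram settings, backbone, and class set are assumptions,
# not the published Knock&Tap implementation.
import torch
import torch.nn as nn
import torchaudio
import torchvision.models as models

SAMPLE_RATE = 16_000   # assumed microphone sample rate
NUM_CLASSES = 2        # assumed classification head: knock vs. tap

# Mel-spectrogram front end: turns a raw 4-channel clip into image-like input.
melspec = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)

# Transfer learning: start from ImageNet weights, then adapt the network.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
# The first conv layer is re-initialised to accept 4 input channels (one per mic).
backbone.conv1 = nn.Conv2d(4, 64, kernel_size=7, stride=2, padding=3, bias=False)
# Replace the classifier head with a 2-way output for knock/tap.
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)

def classify(clip_4ch: torch.Tensor) -> torch.Tensor:
    """clip_4ch: (4, num_samples) tensor from the 4-microphone array."""
    spec = melspec(clip_4ch)              # (4, n_mels, time): one "image" per mic
    logits = backbone(spec.unsqueeze(0))  # add a batch dimension of 1
    return logits.softmax(dim=-1)         # class probabilities
```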
“…They reported an error in the localisation of 1.45cm on the x-axis and 2.72cm on the y-axis. The last two systems [44,45] presented the same drawback: they must preprocess the sound signals offline, meaning that they do not include a touch activity detection phase in their pipeline.…”
Section: Tactile Perception Based On Acoustic Sensing
Confidence: 99%
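The drawback noted above is the absence of an online touch-activity detection phase: the sound signals are preprocessed offline, so the pipeline never decides on its own when a knock or tap has occurred. A minimal sketch of such a stage, assuming a simple short-time RMS-energy gate on one microphone channel (frame size and threshold are illustrative values, not taken from either paper), could look like this:

```python
# Hypothetical touch-activity detector: flag frames whose short-time RMS energy
# exceeds a threshold, so the classifier/localiser only runs on candidate hits.
import numpy as np

def detect_touch(frames: np.ndarray, threshold: float = 0.01) -> np.ndarray:
    """frames: (num_frames, frame_len) windowed audio from one microphone.
    Returns a boolean mask marking frames whose RMS energy exceeds the threshold."""
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    return rms > threshold
```

In practice the threshold would need to be tuned per surface (wood vs. glass) and per microphone gain; this gate is only meant to show where an online detection phase would slot into the pipeline.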