2022
DOI: 10.1007/978-3-030-70601-2_243

Multi-label EMG Classification of Isotonic Hand Movements: A Suitable Method for Robotic Prosthesis Control

Cited by 1 publication (2 citation statements)
References 9 publications
“…Prior approaches to multi-label classification have primarily focused on the generation of composite gestures from individual components, such as the construction of a fist gesture from the collective flexion of individual digits [206], [207], or the classification of multiple axes of action of a single joint [200], [201]. In contrast, we focus on combining gestures that could be used for joint actions, leveraging knowledge of limb biomechanics to construct subgroups of gestures that are mutually exclusive in their actions.…”
Section: Model Selection for Gesture Expressivity
confidence: 99%
“…To our knowledge, only a single study has investigated multi-label classification using a problem transformation approach. In that study, separate classifiers for the flexion and extension of each digit were employed to construct multi-digit movements, with application to prosthetic control [207]. To collect training data, the authors chose a small set of natural gestures; each gesture was labeled according to the flexion or extension of each finger, and the classifiers then interpolated to recognize unseen single-gesture classes.…”
Section: Introduction
confidence: 99%
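
The citation statement above describes a problem-transformation (binary relevance) scheme: one binary classifier per digit action, whose individual decisions are composed into a multi-digit movement. The following Python sketch illustrates that idea only; the label set, feature extraction, and classifier choice are illustrative assumptions and are not taken from the cited paper's actual pipeline.

```python
# Binary-relevance sketch of the per-digit classification scheme described
# in the citation statement above. All names, features, and model choices
# are hypothetical illustrations, not the original authors' implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-digit action labels (flexion only, for brevity).
DIGIT_ACTIONS = ["thumb_flex", "index_flex", "middle_flex", "ring_flex", "little_flex"]


def extract_features(emg_window: np.ndarray) -> np.ndarray:
    """Toy time-domain features per EMG channel: mean absolute value and RMS."""
    mav = np.mean(np.abs(emg_window), axis=0)
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))
    return np.concatenate([mav, rms])


def train_per_digit_classifiers(X: np.ndarray, Y: np.ndarray) -> list:
    """Fit one independent binary classifier per digit-action column of Y."""
    return [
        LogisticRegression(max_iter=1000).fit(X, Y[:, j])
        for j in range(Y.shape[1])
    ]


def predict_movement(classifiers: list, x: np.ndarray) -> dict:
    """Combine the per-digit decisions into a single multi-label gesture."""
    labels = [int(clf.predict(x.reshape(1, -1))[0]) for clf in classifiers]
    return dict(zip(DIGIT_ACTIONS, labels))
```

Because each digit is modeled independently, a gesture never seen during training (e.g. an unfamiliar combination of flexed digits) can still be predicted from the per-digit outputs, which is the interpolation behavior the citation statement attributes to the cited study.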