2023
DOI: 10.1109/lsens.2023.3301837
Robust Fusion Model for Handling EMG and Computer Vision Data in Prosthetic Hand Control

Cited by 2 publications (2 citation statements)
References 12 publications
“…Moreover, the accuracy values in grasp classification are higher than or generally consistent with what has been observed in the literature [17,19], despite a substantial increase in the number of grasps [20,34] and the management of the prosthetic wrist orientation [21,22].…”
Section: Results
Citation type: supporting
Confidence: 86%
“…Recent studies explored the use of a multimodal system with EMG and CVS in prosthetic control [20,21], proposing a strategy for combining these two types of information to improve the accuracy with respect to an EMG classifier. The hand gesture determined based on information gathered from an environmental CVS was used as an additional feature alongside those extracted from EMG signals to infer the final grasp to be executed [22], neglecting wrist orientation. From the state-of-the-art analysis, the need for a novel SCS that manages both hand and wrist configuration by simultaneously taking into account the user's motion intention is therefore evident.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
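
The fusion strategy described in this citation statement, where the gesture class suggested by an environmental computer vision system (CVS) is used as an extra feature alongside EMG features to decide the grasp to execute, can be illustrated with a minimal sketch. The feature dimensions, the one-hot encoding of the vision output, and the random-forest classifier below are illustrative assumptions, not the pipeline reported in the cited papers.

```python
# Minimal sketch of feature-level EMG + vision fusion (illustrative assumptions only):
# the CV-predicted gesture class is one-hot encoded and concatenated with EMG features
# before classifying the grasp to execute.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_GRASPS = 5          # assumed number of grasp classes
N_EMG_FEATURES = 32   # assumed EMG feature dimension (e.g., per-channel RMS/MAV)

def fuse_features(emg_features: np.ndarray, cv_gesture: np.ndarray) -> np.ndarray:
    """Concatenate EMG features with a one-hot encoding of the CV-predicted gesture."""
    one_hot = np.eye(N_GRASPS)[cv_gesture]        # (n_samples, N_GRASPS)
    return np.hstack([emg_features, one_hot])     # (n_samples, N_EMG_FEATURES + N_GRASPS)

# Synthetic data stands in for real recordings.
rng = np.random.default_rng(0)
n_samples = 500
emg = rng.normal(size=(n_samples, N_EMG_FEATURES))
cv_pred = rng.integers(0, N_GRASPS, size=n_samples)   # gesture suggested by the vision system
labels = rng.integers(0, N_GRASPS, size=n_samples)    # ground-truth grasp to execute

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(fuse_features(emg, cv_pred), labels)

# At inference time the same fusion is applied to a new EMG window plus the
# current CVS gesture estimate before predicting the grasp.
new_emg = rng.normal(size=(1, N_EMG_FEATURES))
new_cv = np.array([2])
print(clf.predict(fuse_features(new_emg, new_cv)))
```

Note that this sketch handles only hand grasp selection; as the citing authors point out, it does not account for prosthetic wrist orientation, which motivates their proposed shared-control strategy.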