2017 IEEE Life Sciences Conference (LSC)
DOI: 10.1109/lsc.2017.8268144
Hand gestures recognition using electromyography for bilateral upper limb rehabilitation

Cited by 8 publications (2 citation statements)
References 8 publications
“…These results encourage future developments of the library, which may include tools to automatically segment, classify and quantify sequence of hand movements, such as proposed in de Souza Baptista et al (2017). Incorporating other input modalities, such as Wang et al (2021) and Aguiar and Bo (2017), is also an alternative, as well as integrating other rehabilitation technologies Cardoso et al (2022), particularly due to the potential therapeutic benefits.…”
Section: Discussion
confidence: 99%
“…Fall et al [15] developed a wearable control interface for people who have a weak arm, hands and fingers or are unable to control these body parts but have the ability to control their head/head-part and shoulder. Aguiar and Bó [16] developed a wearable EMG-based system for people with upper limb disabilities. Altakrouri, Burmeister, Boldt and Schrader [17] developed a similar system for people suffering from hand and arm impairment.…”
Section: Dash Human-Computer Interaction
confidence: 99%