2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob)
DOI: 10.1109/biorob.2018.8487923
Visual Cues to Improve Myoelectric Control of Upper Limb Prostheses

Abstract: The instability of myoelectric signals over time complicates their use to control highly articulated prostheses. To address this problem, studies have tried to combine surface electromyography with modalities that are less affected by the amputation and the environment, such as accelerometry or gaze information. In the latter case, the hypothesis is that a subject looks at the object he or she intends to manipulate and that knowing this object's affordances makes it possible to constrain the set of possible grasps. In this p…

Cited by 14 publications (19 citation statements)
References 20 publications
“…The success of this approach relies, however, on the ability to distinguish informative fixations from those that are not necessarily related to any grasp intent. Gigli et al (2018) attempted to address this problem by including the onset of the arm movement as an additional condition, which we have shown here to shorten the window of opportunity. The method used to detect fixations may also shorten this window.…”
Section: Discussion
confidence: 99%
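The statement above hinges on how fixations are detected from the gaze stream. As an illustration only, here is a minimal dispersion-threshold (I-DT) fixation detector in Python; the function name, thresholds, and data layout are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def detect_fixations(gaze, timestamps, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    gaze: (N, 2) array of gaze coordinates (e.g. degrees of visual angle)
    timestamps: (N,) array of sample times in seconds
    Returns a list of (start_index, end_index) fixation windows.
    """
    fixations = []
    start = 0
    n = len(gaze)
    while start < n:
        # Grow the window until it spans at least min_duration.
        end = start
        while end < n and timestamps[end] - timestamps[start] < min_duration:
            end += 1
        if end >= n:
            break
        # Dispersion = (max x - min x) + (max y - min y) over the window.
        window = gaze[start:end + 1]
        if np.ptp(window[:, 0]) + np.ptp(window[:, 1]) <= max_dispersion:
            # Extend the window while dispersion stays under the threshold.
            while end + 1 < n:
                window = gaze[start:end + 2]
                if np.ptp(window[:, 0]) + np.ptp(window[:, 1]) > max_dispersion:
                    break
                end += 1
            fixations.append((start, end))
            start = end + 1
        else:
            start += 1
    return fixations
```

Note that the choice of `max_dispersion` and `min_duration` directly trades off how early a fixation is reported against false detections, which is one way the detection method can shorten the window of opportunity discussed above.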
“…Several studies have attempted to explore this proactivity to help disabled people, such as in a robot assistant scenario (Admoni and Srinivasa, 2016; Koochaki and Najafizadeh, 2018; Saran et al, 2018). Another compelling use case is the control of dexterous upper-limb prostheses (Castellini and Sandini, 2006; Markovic et al, 2014, 2015; Gigli et al, 2018), where deciphering the grasp intent from myoelectric activations alone can be challenging. The integration of gaze and vision as contextual information could be especially helpful during the initial transient phase of a movement.…”
Section: Introduction
confidence: 99%
“…Including reach-to-grasp and release phases in the models can also improve the prostheses, making them more similar to real hands and possibly empowered with multi-sensorial inputs (e.g. [44–46]). Finally, if the results are matched with their muscular correspondents, they may lead to truly natural and continuous myoelectric control of robotic hands, a challenge not yet achieved to the best of our knowledge.…”
Section: Discussion
confidence: 99%
“…This type of sensor fusion, which combines vision and proprioceptive information, is used intensively in biomedical applications such as the transradial prosthetic domain, either to improve control performance (Markovic et al, 2014, 2015) or to recognize objects during grasping so that movements can be adjusted (Došen et al, 2010). This last task can also use Convolutional Neural Networks (CNNs) as feature extractors (Ghazaei et al, 2017; Gigli et al, 2018).…”
Section: Introduction
confidence: 99%
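To make the idea of a CNN as a feature extractor concrete, here is a toy, self-contained NumPy sketch of one convolutional layer followed by ReLU and global average pooling. In practice a pretrained deep network is used (as in Ghazaei et al., 2017); every name, shape, and filter below is an illustrative assumption:

```python
import numpy as np

def conv2d(img, kernels):
    """Valid 2-D correlation of a grayscale image with a bank of kernels.

    img: (H, W) array; kernels: (K, kh, kw) array.
    Returns a (K, H-kh+1, W-kw+1) array of activation maps.
    """
    kh, kw = kernels.shape[1:]
    h, w = img.shape
    out = np.empty((kernels.shape[0], h - kh + 1, w - kw + 1))
    for k, ker in enumerate(kernels):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * ker)
    return out

def cnn_features(img, kernels):
    """One conv layer + ReLU + global average pooling -> feature vector."""
    act = np.maximum(conv2d(img, kernels), 0.0)  # ReLU nonlinearity
    return act.mean(axis=(1, 2))  # one pooled scalar per filter
```

A vector produced this way (in real systems, from the last pooling layer of a deep pretrained network applied to an image of the target object) can then be concatenated with sEMG features and fed to a grasp-type classifier.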