Wearable Robotics 2020
DOI: 10.1016/b978-0-12-814659-0.00021-7

The Modular Prosthetic Limb

Cited by 25 publications (32 citation statements) | References 26 publications
“…Bimanual gesture combinations were predicted using two hierarchical linear classifiers, each trained for a specific hand using inputs from the contralateral motor and somatosensory cortices. Furthermore, by mapping a set of hand gestures to four directions on a 2D plane, the participant was able to use the gesture classifiers to simultaneously control two JHU/APL Modular Prosthetic Limbs (MPLs; Johannes et al., 2011, 2020) in center-out movement tasks. Online performance was explored as a function of the complexity of bilateral gesture combinations.…”
Section: Introduction (mentioning)
confidence: 99%
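As a rough sketch of the decoding scheme described in this statement, the snippet below shows one way a per-hand hierarchical linear classifier could map decoded gestures to four directions on a 2D plane for center-out control. The class structure, the rest-vs-move split, the gesture names, and the gesture-to-direction table are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical two-stage (hierarchical) linear gesture decoder for one hand.
# Stage 1: rest vs. movement; Stage 2: which gesture. Decoded gestures are then
# mapped to four directions on a 2D plane for center-out control of one MPL.
# All names and the gesture-to-direction mapping are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

GESTURE_TO_DIRECTION = {                 # assumed mapping: one gesture per direction
    "pinch":  np.array([0.0, 1.0]),      # up
    "fist":   np.array([0.0, -1.0]),     # down
    "point":  np.array([-1.0, 0.0]),     # left
    "spread": np.array([1.0, 0.0]),      # right
}

class HierarchicalGestureDecoder:
    """Two linear classifiers: rest-vs-move, then gesture identity."""

    def __init__(self):
        self.rest_clf = LinearDiscriminantAnalysis()
        self.gesture_clf = LinearDiscriminantAnalysis()

    def fit(self, features, labels):
        # labels: "rest" or one of the gesture names above
        features = np.asarray(features)
        labels = np.asarray(labels)
        is_move = labels != "rest"
        self.rest_clf.fit(features, is_move)
        self.gesture_clf.fit(features[is_move], labels[is_move])
        return self

    def predict_direction(self, feature_vec):
        """Return a 2D direction command (zeros at rest) for one feature vector."""
        x = np.asarray(feature_vec).reshape(1, -1)
        if not self.rest_clf.predict(x)[0]:
            return np.zeros(2)                       # no movement command
        gesture = self.gesture_clf.predict(x)[0]
        return GESTURE_TO_DIRECTION[gesture]
```

In the cited study, one such decoder per hand would be fed features from the contralateral motor and somatosensory cortices, and the two direction outputs would drive the two MPLs simultaneously.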
“…Online decoding was achieved by binning the incoming neural signal and averaging the normalized firing rates from each electrode across a 240 ms buffer. A gesture prediction from the neural signal was made every 30 ms, and the output was transmitted using a custom software interface to send control commands to the robot control system controlling the MPL (Johannes et al., 2020).…”
Section: Methods (mentioning)
confidence: 99%
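The timing in this statement (normalized firing rates averaged over a 240 ms buffer, with a new gesture prediction every 30 ms) could be organized roughly as in the sketch below. Here `read_spike_counts`, `decoder`, `send_to_mpl`, and the baseline normalization arrays are hypothetical placeholders standing in for the acquisition system, the trained classifier, and the authors' custom software interface to the MPL robot control system.

```python
# Illustrative online-decoding loop: bin the incoming signal, keep a sliding
# 240 ms buffer of normalized firing rates per electrode, and emit a gesture
# prediction every 30 ms. All callables here are assumed placeholders.
import time
from collections import deque

import numpy as np

BIN_MS = 30                       # one new prediction every 30 ms
BUFFER_BINS = 240 // BIN_MS       # 240 ms buffer -> 8 bins of 30 ms

def decode_loop(read_spike_counts, decoder, send_to_mpl,
                baseline_mean, baseline_std):
    """Run the (assumed) online loop until interrupted."""
    buffer = deque(maxlen=BUFFER_BINS)                 # sliding 240 ms window
    while True:
        counts = read_spike_counts(BIN_MS)             # (n_electrodes,) spike counts
        rates = counts * (1000.0 / BIN_MS)             # convert to spikes/s
        buffer.append((rates - baseline_mean) / baseline_std)   # normalize per electrode
        if len(buffer) == BUFFER_BINS:
            features = np.mean(buffer, axis=0)         # average across the 240 ms buffer
            gesture = decoder.predict(features.reshape(1, -1))[0]
            send_to_mpl(gesture)                       # forward command to robot control
        time.sleep(BIN_MS / 1000.0)                    # crude 30 ms pacing
```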
“…Researchers have demonstrated bimanual control of virtual arms using neural signals from bilateral frontal and parietal cortical areas in non-human primates (Ifft et al., 2013). Prior work has demonstrated effective control of seven (Collinger et al., 2013) and even up to 10 (Wodlinger et al., 2015) DOFs in individuals using an invasive cortical BMI to move anthropomorphic robotic limbs; however, despite these impressive advances, bimanual control of two robotic limbs for more complex tasks of daily living can require control over as many as 34 DOFs when using highly dexterous robotic limbs (Johannes et al., 2020). To address this challenge, advanced strategies such as shared control could significantly reduce the number of DOFs a user must command directly to complete two-arm tasks with a BMI.…”
Section: Introduction (mentioning)
confidence: 99%
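Shared control itself is not specified in this excerpt; as one common, purely illustrative formulation, the sketch below blends a low-dimensional user command with an assumed autonomous policy so that the BMI user never commands all 34 DOFs of two dexterous limbs directly. The per-arm DOF count, joint indexing, and blending weight are assumptions consistent only with the 34-DOF total quoted above, not with any cited implementation.

```python
# Purely illustrative shared-control blend (not the cited works' method): the
# user supplies only a 2D reach direction per arm, and an assumed autonomous
# policy fills in a full joint-velocity command for both limbs.
import numpy as np

N_DOF_PER_ARM = 17   # assumption: 34 total DOFs quoted above, split across two arms

def shared_control_step(user_dir_left, user_dir_right, autonomy_policy, alpha=0.6):
    """Blend low-dimensional user intent with a hypothetical full-DOF autonomous command."""
    user_cmd = np.zeros(2 * N_DOF_PER_ARM)
    # Assumed convention: the first two joints of each arm take the user's 2D direction.
    user_cmd[0:2] = user_dir_left
    user_cmd[N_DOF_PER_ARM:N_DOF_PER_ARM + 2] = user_dir_right
    autonomy_cmd = np.asarray(autonomy_policy())       # hypothetical planner output, 34-vector
    return alpha * user_cmd + (1.0 - alpha) * autonomy_cmd
```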
“…Their prosthesis combines sensors, actuators, neuroscience, and complex software with the goal of creating a novel prosthesis [20]. The modular prosthetic limbs allow upper extremity amputees, despite their serious injuries, to perform activities of daily living (ADL) [21,22]. The University of Utah has also developed a prototype arm called LUKE (Figure 5), which can interact with the amputee's nerves and introduce touch into the prosthesis through advanced interfaces between the machine and the body [24].…”
Section: Current Prosthetic Hands (mentioning)
confidence: 99%