Recent studies show that the amplitude of cortical field potentials is modulated in the time domain by grasping kinematics. However, it is unknown whether these low-frequency modulations persist, and contain enough information to decode grasp kinematics, in macro-scale activity measured at the scalp via electroencephalography (EEG). Further, it is unclear whether joint angular velocities or movement synergies constitute the optimal kinematic space to decode. In this offline decoding study, we infer hand joint angular velocities as well as synergistic trajectories from human EEG as subjects perform natural reach-to-grasp movements. Decoding accuracy, measured as the correlation coefficient (r) between the predicted and actual movement kinematics, was r = 0.49 ± 0.02 across 15 hand joints. Across the first three kinematic synergies, decoding accuracies were r = 0.59 ± 0.04, 0.47 ± 0.06, and 0.32 ± 0.05. The spatiotemporal pattern of EEG channel recruitment showed early involvement of contralateral frontal-central scalp areas followed by later activation of central electrodes over primary sensorimotor cortical areas. Information content in EEG about the grasp type peaked at 250 ms after movement onset. The high decoding accuracies in this study are significant not only as evidence for time-domain modulation in macro-scale brain activity, but also for the field of brain-machine interfaces. Our decoding strategy, which harnesses the neural “symphony” as opposed to local members of the neural ensemble (as in intracranial approaches), may provide a means of extracting information about motor intent for grasping without the need for penetrating electrodes, and suggests that it may soon be possible to develop non-invasive neural interfaces for the control of prosthetic limbs.
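Kinematic synergies of the kind decoded above are commonly extracted as the leading principal components of the joint-velocity data. The abstract does not state the extraction method, so the following is a minimal sketch under that PCA assumption; all function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def kinematic_synergies(joint_velocities, n_syn=3):
    """Project joint angular velocities onto their leading principal
    components ('synergies') via SVD of the mean-centered data.

    joint_velocities: array of shape (n_samples, n_joints).
    Returns synergy loadings (n_syn, n_joints) and the synergy
    trajectories over time (n_samples, n_syn).
    """
    X = joint_velocities - joint_velocities.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    weights = Vt[:n_syn]          # synergy loadings across joints
    trajectories = X @ weights.T  # time course of each synergy
    return weights, trajectories
```

Each synergy trajectory is then a candidate decoding target in its own right, which is how per-synergy accuracies such as r = 0.59, 0.47, and 0.32 can be reported separately.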
We investigated how well repetitive finger tapping movements can be decoded from scalp electroencephalography (EEG) signals. A linear decoder with memory was used to infer continuous index finger angular velocities from the low-pass filtered amplitude fluctuations of multiple EEG signals distributed across the scalp. To evaluate the accuracy of the decoder, the Pearson's correlation coefficient (r) between the observed and predicted trajectories was calculated in a 10-fold cross-validation scheme. We also assessed attempts to decode finger kinematics from EEG data cleaned with independent component analysis (ICA), from peripheral sensor data, and from EEG recorded during rest periods. A genetic algorithm (GA) was used to select combinations of EEG channels that maximized decoding accuracies. Our results (lower quartile r = 0.18, median r = 0.36, upper quartile r = 0.50) show that delta-band EEG signals contain useful information that can be used to infer finger kinematics. Further, the highest decoding accuracies were characterized by highly correlated delta-band EEG activity mostly localized to the contralateral central areas of the scalp. Spectral analysis of EEG also showed bilateral alpha-band (8–13 Hz) event-related desynchronizations (ERDs) and contralateral beta-band (20–30 Hz) event-related synchronizations (ERSs) localized over central scalp areas. Overall, this study demonstrates the feasibility of decoding finger kinematics from scalp EEG signals.
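A "linear decoder with memory" of this kind is typically a Wiener-style filter: the kinematic value at each instant is regressed on the current and recent past samples of every EEG channel. The abstract does not give the exact formulation, so this is a minimal sketch assuming a ridge-regularized least-squares fit on a lagged design matrix; all names are illustrative.

```python
import numpy as np

def build_lagged_design(eeg, n_lags):
    """Stack current and past EEG samples (the decoder's 'memory')."""
    n_samples, n_chan = eeg.shape
    X = np.zeros((n_samples, n_chan * n_lags))
    for lag in range(n_lags):
        # column block `lag` holds the EEG shifted back by `lag` samples
        X[lag:, lag * n_chan:(lag + 1) * n_chan] = eeg[:n_samples - lag]
    return np.hstack([np.ones((n_samples, 1)), X])  # prepend a bias term

def fit_linear_decoder(eeg, kinematics, n_lags=10, ridge=1e-3):
    """Least-squares fit of lagged EEG amplitudes to a kinematic trace.
    A small ridge penalty keeps the normal equations well-conditioned."""
    X = build_lagged_design(eeg, n_lags)
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]),
                        X.T @ kinematics)
    return w

def pearson_r(a, b):
    """Decoding accuracy: correlation of predicted vs. observed traces."""
    return float(np.corrcoef(a, b)[0, 1])
```

In a 10-fold cross-validation scheme, `fit_linear_decoder` would be trained on nine folds and `pearson_r` computed between its predictions and the observed trajectory on the held-out fold.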
With continued research on brain-machine interfaces (BMIs), it is now possible to control prosthetic arm position in space to a high degree of accuracy. However, a reliable decoder to infer the dexterous movements of fingers from brain activity during a natural grasping motion has yet to be demonstrated. Here, we present a methodology to accurately predict and reconstruct natural hand kinematics from non-invasively recorded scalp electroencephalographic (EEG) signals during object grasping movements. The high performance of our decoder is attributed to a combination of the correct input space (time-domain amplitude modulation of delta-band smoothed EEG signals) and an optimal subset of EEG electrodes selected using a genetic algorithm. Trajectories of the joint angles were reconstructed for the metacarpo-phalangeal (MCP) joints of the fingers as well as the carpo-metacarpal (CMC) and MCP joints of the thumb. The high decoding accuracy (Pearson's correlation coefficient, r) between the predicted and observed trajectories (r = 0.76 ± 0.01, averaged across joints) indicates that this technique may be suitable for use in a closed-loop real-time BMI to control grasping motion in prosthetics with many degrees of freedom. This demonstrates the first successful decoding of hand pre-shaping kinematics from noninvasive neural signals.
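Electrode-subset selection with a genetic algorithm can be pictured as evolving binary channel masks whose fitness is the cross-validated decoding accuracy obtained with that subset. The abstract does not describe the GA's operators, so the following is a generic sketch (truncation selection, one-point crossover, bit-flip mutation); the function names and parameters are assumptions for illustration.

```python
import numpy as np

def ga_select_channels(fitness, n_channels, pop_size=20, n_gen=30,
                       p_mut=0.05, seed=0):
    """Evolve boolean channel masks that maximize a fitness function.

    fitness: callable mapping a boolean mask of shape (n_channels,)
    to a scalar decoding score (e.g. cross-validated Pearson r).
    """
    rng = np.random.default_rng(seed)
    pop = rng.random((pop_size, n_channels)) < 0.5   # random initial masks
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[:pop_size // 2]]          # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_channels)         # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n_channels) < p_mut   # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[int(np.argmax(scores))]
```

In practice the fitness evaluation dominates the cost, since each candidate mask requires retraining and cross-validating the decoder on the selected channels.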
To harness the increased dexterity and sensing capabilities of advanced prosthetic device designs, amputees will require interfaces supported by novel forms of sensory feedback and novel control paradigms. We are using a motorized elbow brace to feed back grasp forces to the user in the form of extension torques about the elbow. This force display complements myoelectric control of grip closure, in which EMG signals are drawn from the biceps muscle. We expect that the action/reaction coupling experienced by the biceps muscle will produce an intuitive paradigm for object manipulation, and we hope to uncover neural correlates to support this hypothesis. In this paper we present results from an experiment in which 7 able-bodied participants attempted to distinguish three objects by stiffness while grasping them under myoelectric control and feeling reaction forces displayed to their elbow. Across four conditions (with and without force display, and using biceps myoelectric signals ipsilateral and contralateral to the force display), the ability to correctly identify objects was significantly increased with sensory feedback.
Shared control is emerging as a likely strategy for controlling neuroprosthetic devices, in which users specify high-level goals but the low-level implementation is carried out by the machine. In this context, predicting the discrete goal is necessary. Although the ability to grasp various objects is critical to the independence of amputees in daily life, decoding of different grasp types from noninvasively recorded brain activity has not been investigated. Here we show results suggesting that electroencephalography (EEG) is a feasible modality for extracting information about grasp types from the user's brain activity. We found that the information about the intended grasp increases over the grasping movement, and is significantly greater than chance up to 200 ms before movement onset.
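Tracking how grasp-type information evolves over the movement is typically done by classifying trials within sliding time windows and plotting accuracy against time relative to movement onset. The abstract does not name the classifier, so this sketch uses a simple leave-one-out nearest-centroid rule purely for illustration; all names are hypothetical.

```python
import numpy as np

def decodable_info_over_time(trials, labels, win=25):
    """Leave-one-out nearest-centroid accuracy in sliding windows.

    trials: (n_trials, n_samples, n_features) EEG epochs aligned to
    movement onset; labels: grasp type per trial. Returns one accuracy
    value per window, a proxy for decodable grasp information over time.
    """
    n_trials, n_samples, _ = trials.shape
    classes = np.unique(labels)
    acc = []
    for start in range(0, n_samples - win + 1, win):
        # flatten each trial's window into a single feature vector
        X = trials[:, start:start + win].reshape(n_trials, -1)
        correct = 0
        for i in range(n_trials):
            keep = np.arange(n_trials) != i
            cents = [X[keep & (labels == c)].mean(axis=0) for c in classes]
            d = [np.linalg.norm(X[i] - c) for c in cents]
            correct += classes[int(np.argmin(d))] == labels[i]
        acc.append(correct / n_trials)
    return np.array(acc)
```

Accuracy significantly above chance in windows preceding movement onset is what would support the claim that grasp intent is decodable before the movement begins.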
Brain-neural machine interfaces (BNMIs) are systems that allow a user to control an artificial device, such as a computer cursor or a robotic limb, through imagined movements that are measured as neural activity. They offer the potential to restore mobility for those with motor deficiencies caused by stroke, spinal cord injury, or limb amputation. Such systems would have been considered a topic of science fiction a few decades ago but are now being increasingly developed in both research and industry. Researchers in this area are charged with building BNMIs that are safe, effective, easy to use, and affordable for clinical populations. Because of the rapid growth of this new field, however, many issues with development, ethics, metrics, and use need to be addressed. To bring together some of the leading minds, researchers, and critics of the neural prosthetics field to discuss the current state of BNMIs, the 2013 International Workshop on Clinical Brain-Neural Machine Interface Systems was held 24-27 February 2013 in Houston, Texas [1]-[3]. Attendees included researchers from academic and government agencies; program managers from the U.S. Food and Drug Administration (FDA), the National Institutes of Health (NIH), the National Aeronautics and Space Administration (NASA), and the Defense Advanced Research Projects Agency (DARPA); top industry representatives; end users of BNMI technology; medical center and science media representatives; and selected graduate students and postdoctoral researchers (postdocs).