Reaching and grasping in primates depend on the coordination of neural activity in large frontoparietal ensembles. Here we demonstrate that primates can learn to reach and grasp virtual objects by controlling a robot arm through a closed-loop brain–machine interface (BMIc) that uses multiple mathematical models to extract several motor parameters (i.e., hand position, velocity, gripping force, and the EMGs of multiple arm muscles) from the electrical activity of frontoparietal neuronal ensembles. As single neurons typically contribute to the encoding of several motor parameters, we observed that high BMIc accuracy required recording from large neuronal ensembles. Continuous BMIc operation by monkeys led to significant improvements in both model predictions and behavioral performance. Using visual feedback, monkeys succeeded in producing robot reach-and-grasp movements even when their arms did not move. Learning to operate the BMIc was paralleled by functional reorganization in multiple cortical areas, suggesting that the dynamic properties of the BMIc were incorporated into motor and sensory cortical representations.
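The multiple mathematical models mentioned above are, at their core, linear decoders in the Wiener-filter family: each motor parameter is predicted as a weighted sum of recent ensemble activity across several time lags. The sketch below illustrates that structure on synthetic data; all sizes, signals, and noise levels are invented for illustration and are much smaller than the ensembles used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: 1000 time bins, 32 "neurons", and a 2-D
# kinematic signal (e.g. hand x/y position). Real ensembles in the
# study were far larger; the decoder structure is the point here.
T, N, LAGS = 1000, 32, 10
spikes = rng.poisson(2.0, size=(T, N)).astype(float)
W_true = rng.normal(size=(N, 2))
kin = spikes @ W_true + rng.normal(scale=0.5, size=(T, 2))

# Lagged design matrix: each row stacks the most recent LAGS bins
# of every neuron, plus a constant term.
rows = T - LAGS + 1
X = np.ones((rows, N * LAGS + 1))
for t in range(rows):
    X[t, 1:] = spikes[t:t + LAGS].ravel()
Y = kin[LAGS - 1:]

# Fit the linear (Wiener) weights by ordinary least squares, then
# reconstruct the kinematics from neural activity alone.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ W
r = np.corrcoef(pred[:, 0], Y[:, 0])[0, 1]
```

In closed-loop operation the same fit would be refreshed as new data arrive, which is one way the model predictions could improve with continued BMIc use.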
Brain-machine interfaces (BMIs)1,2 use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. While BMIs aim to restore the normal sensorimotor functions of the limbs, so far they have lacked tactile sensation. Here we demonstrate the operation of a brain-machine-brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and enables the signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex (S1). Monkeys performed an active-exploration task in which an actuator (a computer cursor or a virtual-reality hand) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in primary motor cortex (M1). ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and discriminate one out of three visually indistinguishable objects, using the virtual hand to identify the unique artificial texture (AT) associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic, or even virtual prostheses.
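The temporal multiplexing of recording and ICMS epochs can be pictured as alternating windows within a fixed cycle: texture-encoding pulses are delivered only inside stimulation windows, so stimulation artifacts never overlap the decoding window. The sketch below illustrates that scheduling idea only; the window length, pulse rate, and the `schedule` helper are assumptions for illustration, not the paper's parameters.

```python
# Assumed half-cycle duration (ms) for each record/stimulate window.
HALF_CYCLE_MS = 50

def schedule(total_ms, texture_rate_hz):
    """Return (record_windows, stim_pulse_times) for one trial.

    texture_rate_hz encodes an artificial texture as an ICMS pulse
    rate; pulses are emitted only inside stimulation windows.
    """
    record, pulses = [], []
    period_ms = 1000.0 / texture_rate_hz
    next_pulse = 0.0
    for start in range(0, total_ms, 2 * HALF_CYCLE_MS):
        rec = (start, start + HALF_CYCLE_MS)
        stim = (start + HALF_CYCLE_MS, start + 2 * HALF_CYCLE_MS)
        record.append(rec)
        # Emit only the pulses that fall inside this stim window;
        # pulses that would land in a recording window are dropped.
        while next_pulse < stim[1]:
            if next_pulse >= stim[0]:
                pulses.append(next_pulse)
            next_pulse += period_ms
    return record, pulses

rec, pulses = schedule(400, texture_rate_hz=200)
# No pulse ever lands inside a recording window.
assert all(not (a <= p < b) for p in pulses for a, b in rec)
```

Different artificial textures would then correspond to different `texture_rate_hz` values (or temporal patterns) delivered through the same interleaved schedule.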
Monkeys can learn to directly control the movements of an artificial actuator by using a brain-machine interface (BMI) driven by the activity of a sample of cortical neurons. Eventually, they can do so without moving their limbs. Neuronal adaptations underlying the transition from control of the limb to control of the actuator are poorly understood. Here, we show that rapid modifications in neuronal representation of velocity of the hand and actuator occur in multiple cortical areas during the operation of a BMI. Initially, monkeys controlled the actuator by moving a hand-held pole. During this period, the BMI was trained to predict the actuator velocity. As the monkeys started using their cortical activity to control the actuator, the activity of individual neurons and neuronal populations became less representative of the animal's hand movements while representing the movements of the actuator. As a result of this adaptation, the animals could eventually stop moving their hands yet continue to control the actuator. These results show that, during BMI control, cortical ensembles represent behaviorally significant motor parameters, even if these are not associated with movements of the animal's own limb.
Neurophysiological, neuroimaging, and lesion studies point to a highly distributed processing of temporal information by cortico-basal ganglia-thalamic networks. However, there are virtually no experimental data on the encoding of behavioral time by simultaneously recorded cortical ensembles. We predicted temporal intervals from the activity of hundreds of neurons recorded in motor and premotor cortex as rhesus monkeys performed self-timed hand movements. During the delay periods, when animals had to estimate temporal intervals and prepare hand movements, neuronal ensemble activity encoded both the time that elapsed from the previous hand movement and the time until the onset of the next. The neurons that were most informative of these temporal intervals increased or decreased their rates throughout the delay until reaching a threshold value, at which point a movement was initiated. Variability in the self-timed delays was explained by variability in the neuronal rates, but not in the threshold. In addition to predicting temporal intervals, the same neuronal ensemble activity was informative for dissociating the delay periods of the task from the movement periods. Left hemispheric areas were the best source of predictions in one bilaterally implanted monkey overtrained to perform the task with the right hand. However, after that monkey learned to perform the task with the left hand, its left hemisphere continued to contribute to the prediction and the right hemisphere began contributing as well. We suggest that decoding of temporal intervals from bilaterally recorded cortical ensembles could improve the performance of neural prostheses for restoration of motor function.
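The rise-to-threshold account described above can be captured in a minimal model: a neuron's rate ramps during the delay, movement is triggered when the rate crosses a fixed threshold, and trial-to-trial variability in the ramp slope, not in the threshold, produces the variability in self-timed delays. All parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 30.0   # fixed across trials (spikes/s) -- the invariant
DT = 0.01          # integration time step (s)

def delay_until_threshold(slope):
    """Integrate a ramping rate until it crosses THRESHOLD."""
    rate, t = 0.0, 0.0
    while rate < THRESHOLD:
        rate += slope * DT
        t += DT
    return t

# Variable ramp slopes across trials (clipped to stay positive)
# yield variable self-timed delays under a constant threshold.
slopes = np.clip(rng.normal(loc=15.0, scale=3.0, size=200), 5.0, None)
delays = np.array([delay_until_threshold(s) for s in slopes])

# Steeper ramps cross the threshold sooner, so slope and delay are
# strongly anticorrelated.
r = np.corrcoef(slopes, delays)[0, 1]
```

In this toy model the delay is essentially `THRESHOLD / slope`, so all timing variability is inherited from the rate, consistent with the finding that the threshold itself did not vary.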
Brain–machine interfaces (BMIs) are devices that convert neural signals into commands to directly control artificial actuators, such as limb prostheses. Previous real-time methods applied to decoding behavioral commands from the activity of populations of neurons have generally relied upon linear models of neural tuning and were limited in the way they used the abundant statistical information contained in the movement profiles of motor tasks. Here, we propose an n-th order unscented Kalman filter which implements two key features: (1) use of a non-linear (quadratic) model of neural tuning which describes neural activity significantly better than commonly used linear tuning models, and (2) augmentation of the movement state variables with a history of n-1 recent states, which improves prediction of the desired command even before incorporating neural activity information and allows the tuning model to capture relationships between neural activity and movement at multiple time offsets simultaneously. This new filter was tested in BMI experiments in which rhesus monkeys used their cortical activity, recorded through chronically implanted multielectrode arrays, to directly control computer cursors. The 10th order unscented Kalman filter outperformed the standard Kalman filter and the Wiener filter in both off-line reconstruction of movement trajectories and real-time, closed-loop BMI operation.
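The two key features named above, quadratic tuning and state-history augmentation, can be illustrated independently of the full unscented Kalman machinery. The sketch below builds the augmented observation features (state history plus squared kinematic terms) and shows, on synthetic data generated with genuinely quadratic tuning, that the quadratic model fits the activity better than a purely linear one. All shapes, signals, and the `features` helper are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

T, N = 2000, 20
# Toy 2-D velocity trace (a smoothed random walk).
vel = np.cumsum(rng.normal(size=(T, 2)), axis=0) * 0.05

def features(x, n_hist):
    """Stack the current and n_hist-1 past states, then append
    quadratic terms (squared components and speed) plus a bias."""
    rows = []
    for t in range(n_hist - 1, len(x)):
        hist = x[t - n_hist + 1:t + 1].ravel()  # state history
        speed = np.linalg.norm(x[t])            # quadratic term
        rows.append(np.concatenate([hist, hist**2, [speed, 1.0]]))
    return np.array(rows)

# Simulate neurons whose tuning really is quadratic in velocity.
Phi = features(vel, n_hist=1)
C_true = rng.normal(size=(Phi.shape[1], N))
spikes = Phi @ C_true + rng.normal(scale=0.5, size=(Phi.shape[0], N))

def fit_r2(n_hist, quadratic):
    """Least-squares fit of a tuning model; returns a crude R^2."""
    F = features(vel, n_hist)
    if not quadratic:               # drop the squared/speed terms
        k = 2 * n_hist
        F = np.hstack([F[:, :k], F[:, -1:]])
    Y = spikes[-len(F):]
    C, *_ = np.linalg.lstsq(F, Y, rcond=None)
    resid = Y - F @ C
    return 1 - resid.var() / Y.var()

r2_quad = fit_r2(1, quadratic=True)
r2_lin = fit_r2(1, quadratic=False)
```

In the full filter these features form the observation model inside an unscented Kalman update, and setting `n_hist` to 10 corresponds to the 10th-order variant reported above.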