When planning target-directed reaching movements, human subjects combine visual and proprioceptive feedback to form two estimates of the arm's position: one to plan the reach direction, and another to convert that direction into a motor command. These position estimates are based on the same sensory signals but rely on different combinations of visual and proprioceptive input, suggesting that the brain weights sensory inputs differently depending on the computation being performed. Here we show that the relative weighting of vision and proprioception depends both on the sensory modality of the target and on the information content of the visual feedback, and that these factors affect the two stages of planning independently. The observed diversity of weightings demonstrates the flexibility of sensory integration and suggests a unifying principle by which the brain chooses sensory inputs so as to minimize errors arising from the transformation of sensory signals between coordinate frames.
When planning goal-directed reaches, subjects must estimate the position of the arm by integrating visual and proprioceptive signals from the sensory periphery. These integrated position estimates are required at two stages of motor planning: first to determine the desired movement vector, and second to transform the movement vector into a joint-based motor command. We quantified the contributions of each sensory modality to the position estimate formed at each planning stage. Subjects made reaches in a virtual reality environment in which vision and proprioception were dissociated by shifting the location of visual feedback. The relative weighting of vision and proprioception at each stage was then determined using computational models of feedforward motor control. We found that the position estimate used for movement vector planning relies mostly on visual input, whereas the estimate used to compute the joint-based motor command relies more on proprioceptive signals. This suggests that when estimating the position of the arm, the brain selects different combinations of sensory input based on the computation in which the resulting estimate will be used.
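The core idea in both abstracts above is that each position estimate is a weighted combination of the visual and proprioceptive signals, with different weights at the two planning stages. A minimal sketch of that cue-combination model (the weight values below are illustrative assumptions, not the fitted parameters from the study):

```python
# Minimal sketch of two-stage sensory weighting for reach planning.
# The visual weights used here are hypothetical illustrations,
# not fitted values from the experiments described above.

def weighted_estimate(visual_pos, proprio_pos, w_visual):
    """Combine two 2-D position signals using a visual weight in [0, 1];
    the proprioceptive signal gets the complementary weight (1 - w_visual)."""
    return tuple(w_visual * v + (1.0 - w_visual) * p
                 for v, p in zip(visual_pos, proprio_pos))

# Dissociated feedback: the visual cursor is shifted from the felt hand.
visual_hand = (10.0, 0.0)   # cm, shifted visual feedback
proprio_hand = (8.0, 0.0)   # cm, actual (felt) hand position

# Stage 1: movement-vector planning weights vision heavily.
vector_estimate = weighted_estimate(visual_hand, proprio_hand, w_visual=0.8)

# Stage 2: the joint-based motor command relies more on proprioception.
command_estimate = weighted_estimate(visual_hand, proprio_hand, w_visual=0.3)

print(vector_estimate)
print(command_estimate)
```

With a 2 cm visual shift, the two stages land on different internal hand positions (near 9.6 cm and 8.6 cm here), which is what lets the experimental dissociation reveal the two weightings separately.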
Birdsong is a learned behavior remarkable for its high degree of stereotypy. Nevertheless, adult birds display substantial rendition-by-rendition variation in the structure of individual song elements or "syllables." Previous work suggests that some of this variation is actively generated by the avian basal ganglia circuitry for purposes of motor exploration. However, it is unknown whether and how natural variations in premotor activity drive variations in syllable structure. Here, we recorded from the premotor nucleus RA (robust nucleus of the arcopallium) in Bengalese finches and measured whether neural activity covaried with syllable structure across multiple renditions of individual syllables. We found that variations in premotor activity were significantly correlated with variations in the acoustic features (pitch, amplitude, and spectral entropy) of syllables in approximately a quarter of all cases. In these cases, individual neural recordings predicted 8.5 ± 0.3% (mean ± SE) of the behavioral variation, and in some cases accounted for 25% or more of trial-by-trial variations in acoustic output. The prevalence and strength of neuron-behavior correlations indicate that each acoustic feature is controlled by a large ensemble of neurons that vary their activity in a coordinated manner. Additionally, we found that correlations with pitch (but not other features) were predominantly positive in sign, supporting a model of pitch production based on the anatomy and physiology of the vocal motor apparatus. Collectively, our results indicate that trial-by-trial variations in spectral structure are indeed under central neural control at the level of RA, consistent with the idea that such variation reflects motor exploration.
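The neuron-behavior analysis above amounts to correlating a trial-by-trial measure of premotor activity (e.g. firing rate in a premotor window) with an acoustic feature of the same syllable rendition; the squared correlation then gives the fraction of behavioral variation the recording predicts. A sketch under that assumption (the arrays below are synthetic toy data, not recordings from the study):

```python
# Sketch: trial-by-trial correlation between premotor firing rate and
# syllable pitch. The data are synthetic illustrations only.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One value per rendition of the same syllable.
rate  = [42, 48, 45, 50, 44, 47, 43, 49]        # spikes/s
pitch = [3010, 3055, 3030, 3070, 3020, 3050, 3015, 3060]  # Hz

r = pearson_r(rate, pitch)
print(f"r = {r:.2f}, variance explained = {100 * r**2:.1f}%")
```

A positive r here corresponds to the predominantly positive pitch correlations the abstract reports, and r² is the "percent of behavioral variation predicted" statistic (8.5 ± 0.3% on average in the study).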