To explore the extent to which functional systems within the human posterior parietal cortex and the superior temporal sulcus are involved in the perception of action, we measured cerebral metabolic activity in human subjects by positron emission tomography during the perception of simulations of biological motion with point-light displays. The experimental design involved comparisons of activity during the perception of goal-directed hand action, whole body motion, object motion, and random motion. The results demonstrated that the perception of scripts of goal-directed hand action implicated the cortex in the intraparietal sulcus and the caudal part of the superior temporal sulcus, both in the left hemisphere. By contrast, the rostrocaudal part of the right superior temporal sulcus and adjacent temporal cortex, and limbic structures such as the amygdala, were involved in the perception of signs conveyed by expressive body movements.
Motor learning is dependent upon plasticity in motor areas of the brain, but does it occur in isolation, or does it also result in changes to sensory systems? We examined changes to somatosensory function that occur in conjunction with motor learning. We found that even after periods of training as brief as 10 min, sensed limb position was altered and the perceptual change persisted for 24 h. The perceptual change was reflected in subsequent movements: limb movements following learning deviated from the prelearning trajectory in the same direction as, and by an amount comparable in magnitude to, the perceptual shift. Crucially, the perceptual change was dependent upon motor learning. When the limb was displaced passively such that subjects experienced similar kinematics but without learning, no sensory change was observed. The findings indicate that motor learning affects not only motor areas of the brain but changes sensory function as well.
Motor learning changes the activity of cortical motor and subcortical areas of the brain, but does learning affect sensory systems as well? We examined in humans the effects of motor learning using fMRI measures of functional connectivity under resting conditions, and found persistent changes in networks involving both motor and somatosensory areas of the brain. We developed a technique that allows us to distinguish changes in functional connectivity that can be attributed to motor learning from those that are related to perceptual changes that occur in conjunction with learning. Using this technique, we identified a network involved in motor learning, comprising second somatosensory cortex, ventral premotor cortex, and supplementary motor cortex, whose activation is specifically related to perceptual changes that occur in conjunction with motor learning. We also found changes in a network comprising cerebellar cortex, primary motor cortex, and dorsal premotor cortex that were linked to the motor aspects of learning. In each network, we observed highly reliable linear relationships between neuroplastic changes and behavioral measures of either motor learning or perceptual function. Motor learning thus results in functionally specific changes to distinct resting-state networks in the brain.
Recently, Criscimagna-Hemminger et al. (2003) reported a pattern of generalization of force-field adaptation between arms that differs from the pattern that occurs across different configurations of the same arm. Although the intralimb pattern of generalization points to an intrinsic encoding of dynamics, the interlimb transfer described by these authors indicates that information about force is represented in a frame of reference external to the body. In the present study, subjects adapted to a viscous curl-field in two experimental conditions. In one condition, the field was introduced suddenly and produced clear deviations in hand paths; in the second condition, the field was introduced gradually so that at no point during the adaptation process could subjects observe or did they have to correct for a substantial kinematic error. In the first case, a pattern of interlimb transfer consistent with Criscimagna-Hemminger et al. (2003) was observed, whereas no transfer of learning between limbs occurred in the second condition. The findings suggest that there is limited transfer of fine compensatory-force adjustment between limbs. Transfer, when it does occur, may be primarily the result of a cognitive strategy that arises as a result of the sudden introduction of load and associated kinematic error.
The hypothesis that speech goals are defined acoustically and maintained by auditory feedback is a central idea in speech production research. An alternative proposal is that speech production is organized in terms of control signals that subserve movements and associated vocal-tract configurations. Indeed, the capacity for intelligible speech by deaf speakers suggests that somatosensory inputs related to movement play a role in speech production, but studies that might have documented a somatosensory component have been equivocal. For example, mechanical perturbations that have altered somatosensory feedback have simultaneously altered acoustics. Hence, any adaptation observed under these conditions may have been a consequence of acoustic change. Here we show that somatosensory information on its own is fundamental to the achievement of speech movements. This demonstration involves a dissociation of somatosensory and auditory feedback during speech production. Over time, subjects correct for the effects of a complex mechanical load that alters jaw movements (and hence somatosensory feedback), but which has no measurable or perceptible effect on acoustic output. The findings indicate that the positions of speech articulators and associated somatosensory inputs constitute a goal of speech movements that is wholly separate from the sounds produced.
Somatosensory signals from the facial skin and muscles of the vocal tract provide a rich source of sensory input in speech production. We show here that the somatosensory system is also involved in the perception of speech. We use a robotic device to create patterns of facial skin deformation that would normally accompany speech production. We find that when we stretch the facial skin while people listen to words, it alters the sounds they hear. The systematic perceptual variation we observe in conjunction with speech-like patterns of skin stretch indicates that somatosensory inputs affect the neural processing of speech sounds and shows the involvement of the somatosensory system in the perceptual processing of speech.
During multijoint limb movements such as reaching, rotational forces arise at one joint due to the motions of limb segments about other joints. We report the results of three experiments in which we assessed the extent to which control signals to muscles are adjusted to counteract these "interaction torques." Human subjects performed single- and multijoint pointing movements involving shoulder and elbow motion, and movement parameters related to the magnitude and direction of interaction torques were manipulated systematically. We examined electromyographic (EMG) activity of shoulder and elbow muscles and, specifically, the relationship between EMG activity and joint interaction torque. A first set of experiments examined single-joint movements. During both single-joint elbow (experiment 1) and shoulder (experiment 2) movements, phasic EMG activity was observed in muscles spanning the stationary joint (shoulder muscles in experiment 1 and elbow muscles in experiment 2). This muscle activity preceded movement and varied in amplitude with the magnitude of upcoming interaction torque (the load resulting from motion of the nonstationary limb segment). In a third experiment, subjects performed multijoint movements involving simultaneous motion at the shoulder and elbow. Movement amplitude and velocity at one joint were held constant, while the direction of movement about the other joint was varied. When the direction of elbow motion was varied (flexion vs. extension) and shoulder kinematics were held constant, EMG activity in shoulder muscles varied depending on the direction of elbow motion (and hence the sign of the interaction torque arising at the shoulder). Similarly, EMG activity in elbow muscles varied depending on the direction of shoulder motion for movements in which elbow kinematics were held constant. 
The results from all three experiments support the idea that central control signals to muscles are adjusted, in a predictive manner, to compensate for interaction torques: loads arising at one joint that depend on motion about other joints.
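The interaction torques described above can be made concrete with the standard rigid-body equations for a planar two-link (shoulder-elbow) arm. The following sketch is illustrative only: it uses the textbook coupling terms, not the specific model or limb parameters from the study, and the default parameter values are assumed round numbers.

```python
import math

def interaction_torques(th2, dth1, dth2, ddth1, ddth2,
                        m2=1.0, l1=0.3, lc2=0.15, I2=0.025):
    """Interaction torques for a planar two-link arm.

    th2            : elbow angle (rad)
    dth1, dth2     : shoulder and elbow angular velocities (rad/s)
    ddth1, ddth2   : shoulder and elbow angular accelerations (rad/s^2)
    m2, l1, lc2, I2: forearm mass, upper-arm length, forearm center-of-mass
                     distance, forearm moment of inertia (illustrative values)

    Returns (tau_shoulder, tau_elbow): the load each joint experiences
    due to motion of the *other* limb segment.
    """
    h = m2 * l1 * lc2  # inertial coupling coefficient between the segments

    # Load at the shoulder produced by elbow motion:
    # an acceleration-dependent term plus velocity-dependent (Coriolis,
    # centripetal) terms that vanish when the elbow is stationary.
    tau_s = (I2 + m2 * lc2**2 + h * math.cos(th2)) * ddth2 \
            - h * math.sin(th2) * (2.0 * dth1 * dth2 + dth2**2)

    # Load at the elbow produced by shoulder motion.
    tau_e = (I2 + m2 * lc2**2 + h * math.cos(th2)) * ddth1 \
            + h * math.sin(th2) * dth1**2

    return tau_s, tau_e

# During a single-joint elbow movement (shoulder stationary), the shoulder
# still experiences a nonzero load, which is why phasic activity appears in
# shoulder muscles before movement in experiment 1.
tau_s, tau_e = interaction_torques(th2=0.5, dth1=0.0, dth2=2.0,
                                   ddth1=0.0, ddth2=8.0)
```

The sign of `tau_s` flips with the direction of elbow motion (via `ddth2` and `dth2`), consistent with the finding that shoulder EMG varied with the direction of elbow motion when shoulder kinematics were held constant.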
The idea that humans learn and maintain accurate speech by carefully monitoring auditory feedback is widely held. But this view neglects the fact that auditory feedback is highly correlated with somatosensory feedback during speech production. Somatosensory feedback from speech movements could be a primary means by which cortical speech areas monitor the accuracy of produced speech. We tested this idea by placing the somatosensory and auditory systems in competition during speech motor learning. To do this, we combined two speech learning paradigms to simultaneously alter somatosensory and auditory feedback in real time as subjects spoke. Somatosensory feedback was manipulated by using a robotic device that altered the motion path of the jaw. Auditory feedback was manipulated by changing the frequency of the first formant of the vowel sound and playing back the modified utterance to the subject through headphones. The amount of compensation for each perturbation was used as a measure of sensory reliance. All subjects were observed to correct for at least one of the perturbations, but auditory feedback was not dominant. Indeed, some subjects showed a stable preference for either somatosensory or auditory feedback during speech.