We used functional magnetic resonance imaging (fMRI) to map the cortical representations of executed reaching, observed reaching, and imagined reaching in humans. Whereas previous studies have mostly examined hand actions related to grasping, hand-object interactions, or local finger movements, here we were interested in reaching only (i.e., the transport of the hand to a particular location in space), without grasping. We hypothesized that mirror neuron areas specific to reaching-related representations would be active in all three conditions. In accord with this hypothesis, we found an overlap between executed, observed, and imagined reaching activations in dorsal premotor cortex as well as in the superior parietal lobe and the intraparietal sulcus. Activations for observed reaching were more dorsal than activations typically reported in the literature for observation of hand-object interactions (grasping). Our results suggest that the mirror neuron system is specific to the type of hand action performed, and that these fronto-parietal activations are a putative human homologue of the neural circuits underlying reaching in macaques. The parietal activations reported here for executed, imagined, and observed reaching are also consistent with previous functional imaging studies on planned reaching and delayed pointing movements, and extend the proposed localization of human reach-related brain areas to observation as well as imagery of reaching.
Reaching toward a visual target involves at least two sources of information. One is the visual feedback from the hand as it approaches the target. Another is proprioception from the moving limb, which informs the brain of the location of the hand relative to the target even when the hand is not visible. Where these two sources of information are represented in the human brain is unknown. In the present study, we investigated the cortical representations for reaching with or without visual feedback from the moving hand, using functional magnetic resonance imaging. To identify reach-dominant areas, we compared reaching with saccades. Our results show that a reach-dominant region in the anterior precuneus (aPCu), extending into the medial intraparietal sulcus, is equally active in visual and nonvisual reaching. A second region, at the superior end of the parieto-occipital sulcus (sPOS), is more active for visual than for nonvisual reaching. These results suggest that aPCu is a sensorimotor area whose sensory input is primarily proprioceptive, while sPOS is a visuomotor area that receives visual feedback during reaching. In addition to the precuneus, medial, anterior intraparietal, and superior parietal cortex were also activated during both visual and nonvisual reaching, with more anterior areas responding to hand movements only and more posterior areas responding to both hand and eye movements. Our results suggest that cortical networks for reaching are differentially activated depending on the sensory conditions during reaching. This indicates the involvement of multiple parietal reach regions in humans, rather than a single homogeneous parietal reach region.
In primates, control of the limb depends on many cortical areas. Whereas specialized parietofrontal circuits have been proposed for different movements in macaques, functional neuroimaging in humans has revealed widespread, overlapping activations for hand and eye movements and for movements such as reaching and grasping. This review examines the involvement of frontal and parietal areas in hand and arm movements in humans as revealed with functional neuroimaging. The degree of functional specialization, possible homologies with macaque cortical regions, and differences between frontal and posterior parietal areas are discussed, as well as a possible organization of hand movements with respect to different spatial reference frames. The available evidence supports a cortical organization along gradients of sensory (visual to somatosensory) and effector (eye to hand) preferences.
The extent to which different cognitive processes are "embodied" is widely debated. Previous studies have implicated sensorimotor regions such as the lateral intraparietal (LIP) area in perceptual decision making. This has led to the view that perceptual decisions are embodied in the same sensorimotor networks that guide body movements. We used event-related fMRI and effective connectivity analysis to investigate whether the human sensorimotor system implements perceptual decisions. We show that when eye and hand motor preparation is disentangled from perceptual decisions, sensorimotor areas are not involved in accumulating sensory evidence toward a perceptual decision. Instead, inferior frontal cortex increases its effective connectivity with sensory regions representing the evidence, is modulated by the amount of evidence, and shows greater task-positive BOLD responses during the perceptual decision stage. Once eye movement planning can begin, however, an intraparietal sulcus (IPS) area, putatively LIP, participates in motor decisions. Moreover, sensory evidence levels modulate decision and motor preparation stages differently in different IPS regions, suggesting functional heterogeneity of the IPS. Together, these findings suggest that perceptual and motor decisions are implemented by different systems with different neural signatures.