Evidence from patients has shown that primary somatosensory representations are plastic, changing dynamically in response to central or peripheral alterations as well as to experience. Recent research has also demonstrated that altering body posture changes the perceived sensation and localization of tactile stimuli. Drawing on evidence from behavioral studies of brain-damaged and healthy subjects, as well as functional imaging, we propose that the traditional concept of the body schema be divided into three components. First are primary somatosensory representations: representations of the skin surface that are typically somatotopically organized and have been shown to change dynamically with peripheral (usage, amputation, deafferentation) or central (lesion) modifications. Second, we argue for a mapping from a primary somatosensory representation to a secondary representation of body size and shape (the body form representation). Finally, we review evidence for a third set of representations that encodes limb position and is used to represent the location of tactile stimuli relative to the subject in external, non-somatotopic reference frames (postural representations).

From Maps to Skin to Space: Touch and Body Representations

Information regarding body position in space comes from tactile, proprioceptive, visual, vestibular, auditory, and enteroceptive sources. These inputs are integrated to generate representations of the body that are crucial for perception and action. Head and Holmes (1911) introduced the concept of multiple integrated body representations, dividing them into three categories: a postural schema that represents the position of the body in space before and after movement; a superficial schema used to localize the position of sensation on the body surface (both of which form an unconscious body schema); and a conscious representation known as the body image.
Later characterizations of body representations focused primarily on the conscious/unconscious distinction between the body schema and body image (see Gallagher, 1986; Gallagher, 2005; Paillard, 1999). However, this conscious/unconscious dichotomy is likely too simplistic a characterization of body representations (for discussion, see de Vignemont, this issue; Gallese & Sinigaglia, this issue). We propose to use evidence from studies of tactile perception to provide a theoretical framework for understanding body representations. We argue that the representations of the body used in sensory and motor processing (i.e., the body schema as described in Schwoebel & Coslett, 2005) can be divided into three distinct representations used to localize tactile stimuli and interact with the environment.
There is evidence for different levels of visuospatial processing, each with its own frame of reference: viewer-centered, stimulus-centered, and object-centered. The neural locus of these levels can be explored by examining lesion location in subjects with unilateral spatial neglect (USN) manifest in these reference frames. Most studies of the neural locus of USN have treated it as a homogeneous syndrome, yielding conflicting results. To further explore the neural locus of visuospatial processes differentiated by frame of reference, we administered a battery of tests to 171 subjects within 48 hours after right supratentorial ischemic stroke, before possible structural and/or functional reorganization. The battery included MR perfusion-weighted imaging (which shows hypoperfused regions that may be dysfunctional), diffusion-weighted imaging (which reveals areas of infarct or dense ischemia shortly after stroke onset), and tests designed to disambiguate between various types of neglect. Results were consistent with a dorsal/ventral stream distinction in egocentric/allocentric processing. We provide evidence that portions of the dorsal stream of visual processing, including the right supramarginal gyrus, are involved in spatial encoding in egocentric coordinates, whereas parts of the ventral stream (including the posterior inferior temporal gyrus) are involved in allocentric encoding.
Mental motor imagery is subserved by the same cognitive systems that underlie action. In turn, action is informed by the anticipated sensory consequences of movement, including pain. In light of these considerations, one would predict that motor imagery would provide a useful measure of pain-related functional interference. We report a study in which 19 patients with chronic musculoskeletal or radiculopathic arm or shoulder pain, 24 subjects with chronic pain not involving the arm/shoulder, and 41 normal controls were asked to indicate whether a line drawing depicted a right or left hand. Previous work demonstrated that this task is performed by mental rotation of the subject's hand to match the stimulus. Relative to normal and pain control subjects, arm/shoulder pain subjects were significantly slower for stimuli that required greater amplitude rotations. For the arm/shoulder pain subjects only, the degree of slowing correlated with the rated severity of pain with movement but not with the non-specific pain rating. The hand laterality task may supplement the assessment of subjects with chronic arm/shoulder pain.
Several lines of evidence suggest that mental motor imagery is subserved by the same cognitive operations and brain structures that underlie action. Additionally, motor imagery is informed by the anticipated sensory consequences of action, including pain. We reasoned that motor imagery could provide a useful measure of chronic leg or foot pain. Forty subjects with leg pain (19 bilateral, 11 right, and 10 left leg pain), 42 subjects with chronic pain not involving the legs, and 38 controls were shown 12 different line drawings of the right or left foot and asked to indicate which foot was depicted. Previous work suggests that subjects perform this task by mentally rotating their foot to match the visually presented stimulus. All groups of subjects were slower and less accurate with stimuli that required a greater degree of mental rotation of their foot. Subjects with leg pain were both slower and less accurate than normal and pain control subjects in responding to drawings of a painful extremity. Furthermore, subjects with leg pain exhibited a significantly greater decrement in performance for stimuli that required larger amplitude mental rotations. These data suggest that motor imagery may provide important insights into the nature of the pain experience.
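To make the "greater decrement for larger amplitude rotations" finding concrete, the analysis can be thought of as comparing the slope of reaction time against required rotation angle across groups. The sketch below is illustrative only: the angles, reaction times, and variable names are invented, not data from the study.

```python
# Hypothetical sketch: estimating how strongly reaction time (RT) grows with
# the mental-rotation amplitude required by a foot-laterality stimulus.
# All numbers are invented for illustration.

from statistics import mean

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs (ms per degree)."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

angles = [0, 60, 120, 180]            # required rotation (degrees)
control_rts  = [700, 760, 830, 880]   # hypothetical mean RT per angle (ms)
leg_pain_rts = [750, 880, 1010, 1150]

# A steeper slope in the leg-pain group would indicate a disproportionate
# cost for larger-amplitude imagined rotations of the painful limb.
print(slope(angles, control_rts), slope(angles, leg_pain_rts))
```

A group-by-angle interaction of this kind is what distinguishes a general slowing from a rotation-specific deficit.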
Background: Loss of fluency is a significant source of functional impairment in many individuals with aphasia. Repetitive transcranial magnetic stimulation (rTMS) administered to the right inferior frontal gyrus (IFG) has been shown to facilitate naming in persons with chronic left hemisphere stroke and non-fluent aphasia. However, changes in fluency in aphasic subjects receiving rTMS have not been adequately explored. Aims: To determine whether rTMS improves fluency in individuals with chronic non-fluent aphasia, and to identify the aspects of fluency that are modulated in persons who respond to rTMS. Methods & Procedures: Ten individuals with left hemisphere MCA strokes and mild to moderate non-fluent aphasia participated in the study. Before treatment, subjects were asked to describe the Cookie Theft picture in three separate sessions. During treatment, all subjects received 1200 pulses of 1 Hz rTMS daily in 10 sessions over two weeks, at a site that had previously been shown to improve naming. Subjects repeated the Cookie Theft description two months after treatment. Five subjects initially received sham stimulation instead of real rTMS; two months after sham treatment, these individuals received real rTMS. Performance at baseline and after stimulation was coded using Quantitative Production Analysis (Saffran, Berndt & Schwartz, 1989) and Correct Information Unit (Nicholas & Brookshire, 1993) analysis. Outcomes & Results: Across all subjects (n = 10), real rTMS treatment resulted in a significant increase in multiple measures of discourse productivity compared to baseline performance. There was no significant increase in measures of sentence productivity or grammatical accuracy, and no significant increase from baseline in the sham condition (n = 5) on any study measure. Conclusions: Stimulation of the right IFG in patients with chronic non-fluent aphasia facilitates discourse production. We posit that this effect may be attributable to improved lexical-semantic access.
We examined the relationship between subcomponents of embodiment and multisensory integration using a mirror box illusion. The participants' left hand was positioned against the mirror, while their right, hidden hand was positioned 12″, 6″, or 0″ from the mirror, creating a conflict between visual and proprioceptive estimates of limb position in some conditions. After synchronous tapping, asynchronous tapping, or no movement of both hands, participants gave position estimates for the hidden limb and completed a brief embodiment questionnaire. We found a relationship between different subcomponents of embodiment and illusory displacement towards the visual estimate. Illusory visual displacement was positively correlated with feelings of deafference in the asynchronous and no movement conditions, whereas it was positively correlated with ratings of visual capture and limb ownership in the synchronous and no movement conditions. These results provide evidence for dissociable contributions of different aspects of embodiment to multisensory integration.
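The dependent measure here, illusory displacement toward the visual estimate, is essentially a drift score that is then correlated with questionnaire ratings. A minimal sketch of that computation follows; the distances, position estimates, ratings, and function names are all hypothetical, invented purely to illustrate the form of the analysis.

```python
# Hypothetical sketch: quantifying illusory displacement ("drift" toward the
# mirror reflection) and correlating it with embodiment ratings.
# All numbers are invented for illustration; they are not data from the study.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The mirror reflection visually specifies a hand position of 0" from the
# mirror. Drift = how far the reported estimate moved from the true hidden-
# hand position toward that visually specified position.
true_positions = [6, 6, 12, 12, 6, 12]   # inches from the mirror
estimates      = [4, 5, 8, 9, 3, 7]      # reported hidden-hand positions
drift = [t - e for t, e in zip(true_positions, estimates)]

ownership = [5, 4, 6, 6, 3, 5]           # hypothetical 1-7 questionnaire ratings
r = pearson(drift, ownership)
print(round(r, 2))
```

A positive correlation of this kind is what the abstract reports for the synchronous and no movement conditions.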
Representing the locations of tactile stimulation can involve somatotopic reference frames in which locations are defined relative to a position on the skin surface, and also external reference frames which take into account stimulus position in external space. Locations in somatotopic and external reference frames can conflict in terms of left/right assignment when the hands are crossed or positioned outside of their typical hemispace. To investigate the spatial codes of the representation of both tactile stimuli and responses to touch, a Simon effect task, often used in the visual modality to examine issues of spatial reference frames, was deployed in the tactile modality. Participants performed the task with stimuli delivered to the hands with arms in crossed or uncrossed postures and responses were produced with foot pedals. Across all four experiments, participants were faster on somatotopically congruent trials (e.g., left hand stimulus, left foot response) than on somatotopically incongruent trials (left hand stimulus, right foot response) regardless of arm or leg position. However, some evidence of an externally-based Simon effect also appeared in one experiment in which arm (stimulus) and leg (response) position were both manipulated. Overall, the results demonstrate that tactile stimulus and response codes are primarily generated based on their somatotopic identity. However, stimulus and response coding based on an external reference frame can become more salient when both hands and feet can be crossed, creating a situation in which somatotopic and external representations can differ for both stimulus and response codes.
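The core measure in a Simon-effect design is the difference in mean reaction time between incongruent and congruent trials. The sketch below shows that computation for the somatotopic stimulus-response mapping described above; the trial structure and all reaction times are invented for illustration, not data from the experiments.

```python
# Hypothetical sketch of a tactile Simon-effect analysis: each trial records
# the stimulated hand, the responding foot, the arm posture, and the
# reaction time in ms. All numbers are invented for illustration.

from statistics import mean

trials = [
    # (stim_hand, response_foot, arms_crossed, rt_ms)
    ("left",  "left",  False, 420),
    ("left",  "right", False, 470),
    ("right", "right", False, 430),
    ("right", "left",  False, 465),
    ("left",  "left",  True,  440),
    ("left",  "right", True,  480),
    ("right", "right", True,  445),
    ("right", "left",  True,  475),
]

def simon_effect(trials):
    """Mean RT on somatotopically incongruent trials minus congruent trials."""
    congruent   = [rt for hand, foot, _, rt in trials if hand == foot]
    incongruent = [rt for hand, foot, _, rt in trials if hand != foot]
    return mean(incongruent) - mean(congruent)

print(simon_effect(trials))
```

A positive somatotopic effect that survives crossing the arms, as in this toy data set, is the pattern the abstract describes; an externally based effect would instead require recoding congruency by position in space on crossed-posture trials.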