We studied the effect of tactile double simultaneous stimulation (DSS) within and between hands to examine spatial coding of touch at the fingers. Participants performed a go/no-go task to detect a tactile stimulus delivered to one target finger (e.g., right index), stimulated alone or together with a concurrent non-target finger, either on the same hand (e.g., right middle finger) or on the other hand (e.g., left index finger = homologous; left middle finger = non-homologous). Across blocks we also changed the unseen hands' posture (both hands palm-down, or one hand rotated palm-up). When both hands were palm-down, DSS interference effects emerged both within and between hands, but only when the non-homologous finger served as non-target. This suggests a clear segregation between the fingers of each hand, regardless of finger side. By contrast, when one hand was palm-up, interference effects emerged only within a hand, whereas between-hands DSS interference was considerably reduced or absent. Thus, between-hands interference was clearly affected by changes in hand posture. Taken together, these findings provide behavioral evidence in humans for multiple spatial coding of touch during tactile DSS at the fingers. In particular, they confirm the existence of representational stages of touch that distinguish between body regions more than body sides. Moreover, they show that information about the side of tactile stimulation becomes prominent when a postural update is required.
Although the somatosensory homunculus is a classic description of the way somatosensory inputs are processed in the brain, the actual contributions of primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remain poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the fingertips of the index and middle fingers. The first stimulus (adaptor) was delivered either to the index or to the middle finger of the right or left hand, and the second stimulus (test) was always administered to the left index finger. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more after subsequent stimulation of homologous as compared with non-homologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices, this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in SI as well as in SII. Our findings suggest that SI is more than a simple relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating the somatosensory input from the two sides of the body (hands).
According to current textbook knowledge, the primary somatosensory cortex (SI) supports unilateral tactile representations, whereas structures beyond SI, in particular the secondary somatosensory cortex (SII), support bilateral tactile representations. However, dexterous and well-coordinated bimanual motor tasks require early integration of bilateral tactile information: sequential processing, first of unilateral and only subsequently of bilateral sensory information, might not be sufficient to accomplish them. Evidence from the last 15 years is indeed forcing a revision of this textbook notion. Studies in animals and humans indicate that SI is more than a simple relay for unilateral sensory information and, together with SII, contributes to the integration of somatosensory inputs from both sides of the body. Here, we review a series of recent works from our own and other laboratories in favour of interactions between tactile stimuli on the two sides of the body at early stages of processing. We focus on tactile processing, although a similar logic may also apply to other aspects of somatosensation. We begin by describing the basic anatomy and physiology of interhemispheric transfer, drawing on neurophysiological studies in animals and on behavioural studies in humans that have shown tactile interactions between body sides, both in healthy and in brain-damaged individuals. Then we describe the neural substrates of bilateral interactions in somatosensation as revealed by neurophysiological work in animals and by neuroimaging studies in humans (i.e., functional magnetic resonance imaging, magnetoencephalography, and transcranial magnetic stimulation).
Finally, we conclude with considerations on the dilemma of how efficient integration of bilateral sensory information at early processing stages can coexist with more lateralized representations of somatosensory input, in the context of motor control.
Animal studies, as well as behavioural and neuroimaging studies in humans, have documented integration of bilateral tactile information at the level of the primary somatosensory cortex (SI). However, it is still debated whether integration in SI occurs early or late during tactile processing, and whether it is somatotopically organized. To address both the spatial and temporal aspects of bilateral tactile processing, we used magnetoencephalography in a tactile repetition-suppression paradigm. We examined somatosensory evoked responses produced by probe stimuli preceded by an adaptor, as a function of the relative position of adaptor and probe (probe always at the left index finger; adaptor at the index or middle finger of the left or right hand) and as a function of the delay between adaptor and probe (0, 25, or 125 ms). The percentage of response-amplitude suppression was computed by comparing paired (adaptor + probe) stimulations with single stimulations of adaptor and probe. Results show that response suppression varies differentially in SI and SII as a function of both spatial and temporal features of the stimuli. Remarkably, repetition suppression of SI activity emerged early in time, regardless of whether the adaptor stimulus was presented on the same or the opposite body side with respect to the probe. These novel findings support the notion of an early and somatotopically organized inter-hemispheric integration of tactile information in SI.
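The suppression measure described above can be illustrated with a minimal sketch. This is not the authors' analysis code; it merely assumes, for illustration, that suppression is expressed as the percentage reduction of the paired (adaptor + probe) response relative to a linear-summation baseline of the two single-stimulation responses (the function name and amplitude values are hypothetical):

```python
def suppression_percent(paired, adaptor_alone, probe_alone):
    """Percentage of response-amplitude suppression for one condition.

    Assumes a linear-summation baseline: the response expected if the
    adaptor and probe responses simply added up with no interaction.
    """
    expected = adaptor_alone + probe_alone  # baseline (no suppression)
    return 100.0 * (expected - paired) / expected

# Hypothetical amplitudes (arbitrary units): a paired response of 7
# against single responses of 5 each yields 30% suppression.
print(suppression_percent(7.0, 5.0, 5.0))  # → 30.0
```

Under this convention, a value of 0% would indicate purely additive responses, while larger positive values indicate stronger repetition suppression.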
Our body is a unique entity by which we interact with the external world. Consequently, the way we represent our body has profound implications for the way we process and locate sensations and, in turn, perform appropriate actions. The body can be the subject, but also the object, of our experience, providing information from sensations on the body surface and viscera, but also knowledge of the body as a physical object. However, the extent to which different senses contribute to constructing the rich and unified body representations we all experience remains unclear. In this review, we aim to bring together recent research showing important roles for several different sensory modalities in constructing body representations. At the same time, we hope to generate new ideas about how, and at which level, the senses contribute to the different levels of body representation and how they interact. We present an overview of some of the most recent neuropsychological evidence about multisensory control of pain, and the way that the visual, auditory, vestibular, and tactile systems contribute to the creation of coherent representations of the body. We focus particularly on some of the topics discussed in the symposium on Multimodal Contributions to Body Representation held at the 15th International Multisensory Research Forum (2015, Pisa, Italy).
The processing of touch depends on multiple factors, such as the properties of the skin and the type of receptors stimulated, as well as features related to the actual configuration and shape of the body itself. A large body of research has focused on the effect that the nature of the stimuli has on tactile processing. Less research, however, has focused on features beyond the nature of the touch. In this review, we focus on body-related features that have been investigated for a shorter time and in a more fragmented way. These include the symmetrical quality of the two sides of the body, the postural configuration of the body, and the size and shape of different body parts. We describe what we consider three key aspects: (1) how and at which stages tactile information is integrated between different parts and sides of the body; (2) how tactile signals are integrated with online and stored postural configurations of the body, regarded as priors; and (3) how tactile signals are integrated with representations of body size and shape. We describe how these different body dimensions affect the integration of tactile information and guide motor behavior, bringing them together in a single model of tactile processing. We review a wide range of neuropsychological, neuroimaging, and neurophysiological data and suggest a revised model of tactile integration on the basis of the one proposed previously by Longo et al.
Several recent reports have shown that even healthy adults maintain highly distorted representations of the size and shape of their body. These distortions have been shown to be highly consistent across different study designs and dependent measures. However, previous studies have found that visual judgments of size can be modulated by the experimental instructions used, for example, by asking for judgments of the participant's subjective experience of stimulus size (i.e., apparent instructions) versus judgments of actual stimulus properties (i.e., objective instructions). Previous studies investigating internal body representations have relied exclusively on 'apparent' instructions. Here, we investigated whether apparent versus objective instructions modulate findings of distorted body representations underlying position sense (Exp. 1), tactile distance perception (Exp. 2), and the conscious body image (Exp. 3). Our results replicate the characteristic distortions previously reported for each of these tasks and further show that these distortions are not affected by instruction type (i.e., apparent vs. objective). These results show that the distortions measured with these paradigms are robust to differences in instructions and do not reflect a dissociation between perception and belief.
Much is known about the functional mechanisms involved in visual search. Yet, the fundamental question of whether the visual system can perform different types of visual analysis at different spatial resolutions still remains unsettled. In the visual-attention literature, the distinction between different spatial scales of visual processing corresponds to the distinction between distributed and focused attention. Some authors have argued that singleton detection can be performed in distributed attention, whereas others suggest that even such a simple visual operation involves focused attention. Here we showed that microsaccades were spatially biased during singleton discrimination but not during singleton detection. The results provide support to the hypothesis that some coarse visual analysis can be performed in a distributed attention mode.