Highlights
- Human sensorimotor system rapidly localizes touch on a hand-held tool
- Brain responses in a deafferented patient suggest vibrations encode touch location
- Somatosensory cortex efficiently extracts touch location from the tool's vibrations
- Somatosensory cortex reuses neural processes devoted to mapping touch on the body
Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Though it seems straightforward, this simple representation belies the complex link between an activation in somatosensory Area 3b and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, though how this is computed by neural networks is unknown. We propose that somatosensory cortex implements multilateration, a common computation used by surveying and GPS systems to localize objects. Specifically, to decode touch location on the body, the somatosensory system estimates the relative distance between the afferent input and the body's joints. We show that a simple feedforward neural network, which captures the receptive field properties of somatosensory cortex, implements a Bayes-optimal multilateral decoder via a combination of bell-shaped (Area 3b) and sigmoidal (Areas 1/2) tuning curves. Simulations demonstrated that this decoder produced a unique pattern of localization variability between two joints that was not produced by other known neural decoders. Finally, we identify this neural signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
The sense of touch is not restricted to the body but can also extend to external objects. When we use a hand-held tool to contact an object, we feel the touch on the tool and not in the hand holding the tool. The ability to perceive touch on a tool extends along its entire surface, allowing the user to accurately localize where it is touched, much as they would on their own body. Although the neural mechanisms underlying the ability to localize touch on the body have been investigated extensively, those underlying the localization of touch on a tool remain unknown. We aimed to fill this gap by recording the electroencephalography (EEG) signal of participants while they localized tactile stimuli on a hand-held rod. We focused on oscillatory activity in the alpha (7–14 Hz) and beta (15–30 Hz) ranges, as both have previously been linked to distinct spatial codes used to localize touch on the body: beta activity reflects the mapping of touch in skin-based coordinates, whereas alpha activity reflects the mapping of touch in external space. We found that only alpha activity was modulated by the location of tactile stimuli applied to a hand-held rod. Source reconstruction suggested that this alpha power modulation was localized to a network of fronto-parietal regions previously implicated in higher-order tactile and spatial processing. These findings are the first to implicate alpha oscillations in tool-extended sensing and suggest an important role for the processing of touch in external space when localizing touch on a tool.
Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Although it seems straightforward, this simple representation belies the complex link between an activation in a somatotopic map and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, but how this is computed by neural networks is unknown. We propose that the somatosensory system implements multilateration, a common computation used by surveying and global positioning systems to localize objects. Specifically, to decode touch location on the body, multilateration estimates the relative distance between the afferent input and the boundaries of a body part (e.g., the joints of a limb). We show that a simple feedforward neural network, which captures several fundamental receptive field properties of cortical somatosensory neurons, can implement a Bayes-optimal multilateral computation. Simulations demonstrated that this decoder produced a pattern of localization variability between two boundaries that was unique to multilateration. Finally, we identify this computational signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.
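The core computation described above, fusing the distances from a touch to the two boundaries of a body part into a single location estimate, can be sketched in a few lines. This is an illustrative simulation, not the authors' network model: the segment length `L`, Weber-like noise constant `k`, and Gaussian noise are all assumed here, and the inverse-variance weighting stands in for the Bayes-optimal decoder the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 1.0       # length of the limb segment (arbitrary units; assumed for illustration)
x_true = 0.3  # true touch location, measured from the proximal joint
k = 0.1       # Weber-like noise: SD of a distance estimate grows with the distance
n = 100_000   # number of simulated trials

# Noisy distance estimates from each boundary of the segment
d_prox = x_true + rng.normal(0.0, k * x_true, n)              # from the proximal joint
d_dist = (L - x_true) + rng.normal(0.0, k * (L - x_true), n)  # from the distal joint

# Each estimate implies a touch location; fuse them with inverse-variance weights,
# the Bayes-optimal combination for independent Gaussian estimates
w_prox = 1.0 / (k * x_true) ** 2
w_dist = 1.0 / (k * (L - x_true)) ** 2
x_hat = (w_prox * d_prox + w_dist * (L - d_dist)) / (w_prox + w_dist)

print(round(x_hat.mean(), 2))                               # close to the true location
print(x_hat.std() < min(d_prox.std(), (L - d_dist).std()))  # fused estimate is more precise
```

The fused estimate is unbiased and has lower variance than either single-boundary estimate, which is what makes reading out location from two distance codes worthwhile.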
It is often claimed that tools are embodied by the user, but whether the brain actually repurposes its body-based computations to perform similar tasks with tools is not known. A fundamental body-based computation used by the somatosensory system is trilateration, in which the location of touch on a limb is computed by integrating estimates of the distance between the sensory input and the limb's boundaries (e.g., the elbow and wrist of the forearm). As evidence of this computational mechanism, tactile localization on a limb is most precise near its boundaries and least precise in the middle. If the brain repurposes trilateration to localize touch on a tool, we should observe this computational signature in behavior. In a large sample of participants, we indeed found that localizing touch on a tool produced the signature of trilateration, with the highest precision close to the base and tip of the tool. A computational model of trilateration provided a good fit to the observed localization behavior. Importantly, model selection demonstrated that trilateration explained each participant's behavior better than an alternative model of localization. These results have important implications for how trilateration may be implemented by somatosensory neural populations. In sum, the present study suggests that tools are indeed embodied at a computational level, repurposing a fundamental spatial computation.
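The inverted-U signature above (precision highest near the boundaries, lowest in the middle) falls out of distance-dependent noise combined with optimal integration. A minimal simulation illustrates this; the rod length, Weber fraction, and Gaussian noise model are assumed for the sketch and are not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
L, k, n = 1.0, 0.1, 50_000  # rod length, Weber-like noise constant, trials (all assumed)

def localization_sd(x):
    """SD of the fused location estimate for a touch at position x along the rod."""
    v_base = (k * x) ** 2        # variance of the distance estimate from the base
    v_tip = (k * (L - x)) ** 2   # variance of the distance estimate from the tip
    d_base = x + rng.normal(0.0, np.sqrt(v_base), n)
    d_tip = (L - x) + rng.normal(0.0, np.sqrt(v_tip), n)
    # Inverse-variance-weighted (optimal) integration of the two estimates
    x_hat = (d_base / v_base + (L - d_tip) / v_tip) / (1.0 / v_base + 1.0 / v_tip)
    return x_hat.std()

positions = np.linspace(0.1, 0.9, 9)
sds = [localization_sd(x) for x in positions]

# Variability peaks in the middle of the rod and shrinks toward the base and tip
print(sds[4] > sds[0] and sds[4] > sds[-1])  # True
```

Because each distance estimate is noisiest far from its own boundary, touches in the middle of the rod get two mediocre estimates while touches near an end get one very precise one, producing the inverted-U variability pattern the behavioral data show.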
Numerous studies have suggested that tools become incorporated into a representation of our body. A prominent hypothesis holds that the brain re-uses body-based computations when we use tools, but little is known about how this is implemented at the neural level. Here we used the ability to localize touch on both tools and body parts as a case study to fill this gap. Neural oscillations in the alpha (8-13 Hz) and beta (15-25 Hz) frequency bands are involved in mapping touch on the body in distinct reference frames: alpha activity reflects the mapping of touch in external coordinates, whereas beta activity reflects the mapping of touch in skin-centered coordinates. Here, we aimed to pinpoint the role of these oscillations during tool-extended sensing. We recorded participants' oscillatory activity while tactile stimuli were applied to either the hands or the tips of hand-held rods. The hands/tool tips were either uncrossed or crossed at the body midline so that we could disentangle brain responses related to the different coordinate systems. We found that alpha-band activity was modulated similarly across postures when localizing touch on hands and on tools, reflecting the position of touch in external space. Source reconstruction also indicated that a similar network of cortical regions was involved for tools and hands. Our findings strongly suggest that the brain uses similar oscillatory mechanisms for mapping touch on the body and on tools, supporting the idea that neural processes are repurposed for tool use.