This perspective review focuses on the proposal that predictive multisensory integration occurring in one's peripersonal space (PPS) supports individuals' ability to interact efficiently with others, and that integrating sensorimotor signals from the interacting partners leads to the emergence of a shared representation of the PPS. To support this proposal, we first introduce the features of body and PPS representations that are relevant for interpersonal motor interactions. Then, we highlight the role of action planning and execution in the dynamic expansion of the PPS. We continue by presenting evidence of PPS modulations after tool use and review studies suggesting that PPS expansions may be accounted for by Bayesian sensory filtering through predictive coding. In the central section, we describe how this conceptual framework can be used to explain the mechanisms through which the PPS may be modulated by the actions of our interaction partner, in order to facilitate interpersonal coordination. Last, we discuss how this proposal may account for recent evidence concerning PPS rigidity in Autism Spectrum Disorder (ASD) and its possible relationship with ASD individuals' difficulties during interpersonal coordination. Future studies will need to clarify the mechanisms and neural underpinnings of these dynamic, interpersonal modulations of the PPS.
Consistent with current models of embodied emotions, this study investigates whether the somatosensory system shows reduced sensitivity to facial emotional expressions in autistic compared with neurotypical individuals, and whether these differences are independent of between-group differences in visual processing of facial stimuli. To investigate the dynamics of somatosensory activity over and above visual carryover effects, we recorded EEG activity from two groups of participants, autistic (ASD) and typically developing (TD) individuals (male and female), while they performed a facial emotion discrimination task and a control gender discrimination task. To probe the state of the somatosensory system during face processing, in 50% of trials we evoked somatosensory activity by delivering task-irrelevant tactile taps to participants' index finger, 105 ms after visual stimulus onset. Importantly, we isolated somatosensory from concurrent visual activity by subtracting visual responses from activity evoked by combined somatosensory and visual stimulation. Results revealed significant task-dependent group differences in mid-latency components of somatosensory evoked potentials (SEPs). ASD participants showed a selective reduction of SEP amplitudes (P100) compared with TD participants during the emotion task; and TD, but not ASD, participants showed increased somatosensory responses during emotion compared with gender discrimination. Interestingly, autistic traits, but not alexithymia, significantly predicted SEP amplitudes evoked during the emotion task, but not the gender task. Importantly, we did not observe the same pattern of group differences in visual responses. Our study provides direct evidence of reduced recruitment of the somatosensory system during emotion discrimination in ASD and suggests that this effect is not a byproduct of differences in visual processing.