The sense of body-ownership relies on the representation of both interoceptive and exteroceptive signals coming from one's body. However, it remains unknown how the integration of bodily signals coming from "outside" and "inside" the body is instantiated in the brain. Here, we used a modified version of the Enfacement Illusion to investigate whether the integration of visual and cardiac information can alter self-face recognition (Experiment 1) and neural responses to heartbeats (Experiment 2). We projected a pulsing shade that was synchronous or asynchronous with the participant's heartbeat onto a picture depicting the participant's face morphed with the face of an unfamiliar other. Results revealed that synchronous (vs. asynchronous) cardio-visual stimulation led to increased self-identification with the other's face (Experiment 1), while during stimulation, synchronicity modulated the amplitude of the Heartbeat Evoked Potential, an electrophysiological index of cortical interoceptive processing (Experiment 2). Importantly, the magnitude of the illusion-related effects depended on, and increased linearly with, the participants' Interoceptive Accuracy. These results provide the first direct neural evidence for the integration of interoceptive and exteroceptive signals in bodily self-awareness.
Current models of face perception propose that initial visual processing is followed by activation of nonvisual somatosensory areas that contributes to emotion recognition. To test whether there is a pure and independent involvement of somatosensory cortex (SCx) during face processing over and above visual responses, we directly measured participants' somatosensory-evoked activity by tactually probing the state of SCx (105 ms after onset of the visual facial stimuli) during an emotion discrimination task while controlling for visual effects. Discrimination of emotional versus neutral expressions enhanced early somatosensory-evoked activity between 40 and 80 ms after stimulus onset, suggesting visual emotion processing in SCx. This effect was source localized within primary, secondary, and associative somatosensory cortex. Emotional face processing influenced somatosensory responses to both face (congruent body part) and finger (control site) tactile stimulation, suggesting a general process that includes nonfacial cortical representations. Gender discrimination of the same facial expressions did not modulate somatosensory-evoked activity. We provide novel evidence that SCx activation is not a byproduct of visual processing but is independently shaped by face emotion processing.
The perception of internal bodily signals (interoception) is central to many theories of emotion and embodied cognition. According to recent theoretical views, the sensory processing of visceral signals such as one's own heartbeat is determined by top-down predictions about the expected interoceptive state of the body (interoceptive inference). In this EEG study we examined neural responses to heartbeats following expected and unexpected emotional stimuli. We used a modified stimulus repetition task in which pairs of facial expressions were presented with repeating or alternating emotional content, and we manipulated the emotional valence and the likelihood of stimulus repetition. We found that affective predictions of external socially relevant information modulated the heartbeat-evoked potential (HEP), a marker of cardiac interoception. Crucially, the HEP changes depended strongly on the expected emotional content of the facial expression. Thus, expected negative faces led to a decreased HEP amplitude, whereas such an effect was not observed after an expected neutral face. These results suggest that valence-specific affective predictions, and their uniquely associated predicted bodily sensory state, can reduce or amplify cardiac interoceptive responses. In addition, the affective repetition effects were dependent on repetition probability, highlighting the influence of top-down exteroceptive predictions on interoception. Our results are in line with recent models of interoception supporting the idea that predicted bodily states influence sensory processing of salient external information.