Although previous studies have shown that an emotional context may alter touch processing, it is not clear how visual contextual information modulates sensory signals, or at what levels this modulation takes place. Therefore, we investigated how a toucher’s emotional expressions (anger, happiness, fear, and sadness) modulate the touchee’s somatosensory-evoked potentials (SEPs) in different temporal ranges. Participants were presented with tactile stimulation that appeared to originate from expressive characters in virtual reality. Touch processing was indexed using SEPs, and self-reports of touch experience were collected. Early potentials were amplified after angry, happy, and sad facial expressions, while late potentials were amplified after anger but attenuated after happiness. These effects correspond to two stages of emotional modulation of tactile perception: anticipation and interpretation. The findings show that not only does touch affect emotion, but emotional expressions also affect touch perception. The affective modulation of touch emerged as early as 25 ms after touch onset, suggesting that emotional context is integrated into tactile sensation at a very early stage.
With the advent of consumer-grade virtual reality (VR) headsets and physiological measurement devices, new possibilities for mediated social interaction are emerging, enabling immersion in environments whose visual features react to the users' physiological activation. In this study, we investigated whether and how individual and interpersonally shared biofeedback (visualised respiration rate and frontal asymmetry of electroencephalography, EEG) enhances synchrony between the users' physiological activity and perceived empathy towards the other during a compassion meditation exercise carried out in a social VR setting. The study was conducted as a laboratory experiment (N=72) employing the Unity3D-based Dynecom immersive social meditation environment and two amplifiers to collect the psychophysiological signals for the biofeedback. Biofeedback on empathy-related EEG frontal asymmetry evoked higher self-reported empathy towards the other user than biofeedback on respiratory activation, but perceived empathy was highest when both types of feedback were presented simultaneously. In addition, participants reported more empathy when there was stronger EEG frontal asymmetry synchronization between the users. These results inform the field of affective computing about the possibilities that VR offers for applications of empathic technologies.
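The EEG frontal asymmetry used as biofeedback above is conventionally quantified as the difference in log band power between homologous right and left frontal electrodes. A minimal sketch, assuming alpha-band (8–13 Hz) power at sites such as F3/F4; the study's exact band, electrode choice, and spectral estimator are not specified here:

```python
import numpy as np

def frontal_asymmetry(left_f3, right_f4, fs, band=(8.0, 13.0)):
    """Frontal asymmetry index: ln(right band power) - ln(left band power).

    left_f3, right_f4: 1-D arrays of EEG samples from homologous
    left/right frontal sites (channel names are illustrative).
    fs: sampling rate in Hz.  band: frequency band of interest in Hz.
    Positive values indicate greater right-hemisphere band power.
    """
    def band_power(x):
        # Periodogram estimate of power in the requested band.
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    return np.log(band_power(right_f4)) - np.log(band_power(left_f3))
```

In practice, power would be estimated from artifact-free epochs (e.g. via Welch averaging) rather than a single raw FFT, but the index itself is this log-power difference.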
Virtual reality presents an extraordinary platform for multimodal communication. Haptic technologies have been shown to make an important contribution by facilitating co-presence and allowing affective communication. However, findings on these affective influences come from studies that have used myriad types of haptic technology, making it likely that some forms of tactile feedback communicate emotions more efficiently than others. To find out whether this is true and which haptic technologies are most effective, we measured user experience during a communication scenario featuring an affective agent and interpersonal touch in virtual reality. Interpersonal touch was simulated using two types of vibrotactile actuators and two types of force feedback mechanisms. Self-reports of the subjective experience of the agent’s touch and emotions were obtained. The results revealed that, regardless of the agent’s expression, force feedback actuators were rated as more natural and resulted in greater emotional interdependence and a stronger sense of co-presence than vibrotactile touch.
Receiving a tender caress from a caregiver or spouse reduces stress and promotes emotional wellbeing, but receiving the same caress from a stranger makes us feel uncomfortable. According to recent neurophysiological findings, we not only react differently to invited versus uninvited touch but also perceive the touch differently depending on context. A virtual reality experiment was conducted to investigate whether individual differences in behavioral inhibition system (BIS) sensitivity and gender contribute to this affective touch perception. Touch perception was measured directly using self-reports and indirectly using the touch-related orienting response. The results showed that touch perception depended on the emotional expression of the virtual agents. High-arousal approach-related (happiness, anger) and avoidance-related (fear) expressions increased self-reported touch intensity, while happiness reduced the orienting response to touch. Moreover, interpersonal differences in behavioral inhibition and gender played distinct roles: BIS sensitivity in males was associated with stronger affective touch perception, particularly for high-arousal emotions, whereas in females BIS sensitivity did not affect touch perception. The results suggest that individual differences related to preferences in tactile communication also determine how touch is perceived.
The tendency to simulate the pain of others within our own sensorimotor systems is a vital component of empathy. However, this sensorimotor resonance is modulated by a multitude of social factors, including similarity in bodily appearance, e.g. skin colour. The current study investigated whether increasing self–other similarity via virtual transfer to a body of another colour reduces ingroup bias in sensorimotor resonance. A sample of 58 white participants was momentarily transferred to either a black or a white body using virtual reality technology. We then employed electroencephalography to examine event-related desynchronization (ERD) of sensorimotor beta (13–23 Hz) oscillations while participants viewed black, white, and violet photorealistic virtual agents being touched with a noxious or soft object. While the noxious treatment of a violet agent did not increase beta ERD, amplified beta ERD in response to the black agent’s noxious versus soft treatment was found in perceivers transferred to a black body; transfer to the white body abolished the effect. Further exploratory analysis implied that pain-related beta ERD occurred only when the agent and the participant were of the same colour. The results suggest that even short-lasting changes in bodily resemblance can modulate sensorimotor resonance to others’ perceived pain.
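The beta ERD analysed above expresses stimulus-related power change relative to a pre-stimulus baseline. A minimal sketch of one common percent-change convention; sign conventions vary across studies, and here a positive value denotes desynchronization (a power drop):

```python
def event_related_desynchronization(baseline_power, event_power):
    """Percent power decrease in a frequency band (e.g. beta, 13-23 Hz)
    during an event window, relative to a pre-stimulus baseline.

    Positive values indicate desynchronization (power drop);
    negative values indicate synchronization (power increase).
    """
    return 100.0 * (baseline_power - event_power) / baseline_power
```

The band-power inputs would themselves come from time-frequency decomposition of artifact-free epochs; this function only captures the normalization step.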
Earlier studies have revealed cross-modal visuo-tactile interactions in endogenous spatial attention. The current research used event-related potentials (ERPs) and virtual reality (VR) to identify how visual cues of the perceiver’s body affect visuo-tactile interaction in endogenous spatial attention and at what point in time the effect takes place. A bimodal oddball task with lateralized tactile and visual stimuli was presented in two VR conditions, one with and one without visible hands, and one VR-free control with hands in view. Participants were required to silently count one type of stimulus and ignore all stimuli presented in an irrelevant modality or location. The presence of hands modulated early and late components of somatosensory and visual evoked potentials. At sensory-perceptual stages, the presence of virtual or real hands amplified attention-related negativity in the somatosensory N140 and cross-modal interaction in the somatosensory and visual P200. At postperceptual stages, an amplified N200 component was obtained in somatosensory and visual evoked potentials, indicating increased response inhibition to non-target stimuli. The somatosensory, but not the visual, N200 effect was enhanced when the virtual hands were present. The findings suggest that bodily presence affects sustained cross-modal spatial attention between vision and touch, and that this effect appears specifically in ERPs related to early and late sensory processing as well as response inhibition, but does not affect later attention- and memory-related P3 activity. Finally, the experiments provide comparable scenarios for estimating the signal-to-noise ratio to quantify effects related to the use of a head-mounted display (HMD). Despite valid a priori reasons to fear signal interference from an HMD, we observed no significant drop in the robustness of our ERP measurements.
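ERP components such as the N140, P200, and N200 discussed above are typically quantified as the mean amplitude of the trial-averaged waveform within a fixed post-stimulus window. A minimal sketch; the window boundaries shown (120–160 ms, loosely matching an N140) are illustrative, not taken from the study:

```python
import numpy as np

def erp_mean_amplitude(epochs, fs, epoch_start_ms, window_ms):
    """Mean amplitude of the averaged ERP in a post-stimulus window.

    epochs: (n_trials, n_samples) array; each epoch starts at
    epoch_start_ms relative to stimulus onset (negative = pre-stimulus).
    fs: sampling rate in Hz.  window_ms: (start, stop) in ms post-onset.
    """
    erp = epochs.mean(axis=0)            # average across trials
    ms_per_sample = 1000.0 / fs
    start = int(round((window_ms[0] - epoch_start_ms) / ms_per_sample))
    stop = int(round((window_ms[1] - epoch_start_ms) / ms_per_sample))
    return erp[start:stop].mean()        # mean amplitude in the window
```

A full pipeline would also baseline-correct each epoch against its pre-stimulus interval before averaging; that step is omitted here for brevity.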
Nonverbal communication determines much of how we perceive explicit, verbal messages. Facial expressions and social touch, for example, influence affinity and conformity. To understand the interaction between nonverbal and verbal information, we studied how the psychophysiological time course of semiotics, the decoding of the meaning of a message, is altered by interpersonal touch and facial expressions. A virtual-reality-based economic decision-making game, the ultimatum game, was used to investigate how participants perceived, and responded to, financial offers of variable levels of fairness. In line with previous studies, unfair offers evoked medial frontal negativity (MFN) within the N2 time window, which has been interpreted as reflecting an emotional reaction to violated social norms. Contrary to this emotional interpretation of the MFN, however, nonverbal signals did not modulate the MFN component, affecting fairness perception only during the P3 component. This suggests that the nonverbal context affects the late, but not the early, stage of fairness perception. We discuss the semiotics of the message and the messenger as a process in which parallel information sources of “who says what” are integrated in reverse order: first the message, then the messenger. The online version of this article (10.3758/s13415-019-00738-8) contains supplementary material, which is available to authorized users.