Virtual reality (VR) has made its way into mainstream psychological research in the last two decades. This technology, with its unique ability to simulate complex, real situations and contexts, offers researchers unprecedented opportunities to investigate human behavior in well-controlled laboratory designs. One important application of VR is the investigation of pathological processes in mental disorders, especially anxiety disorders. Research on the processes underlying threat perception, fear, and exposure therapy has shed light on more general aspects of the relation between perception and emotion. Being by its nature a simulation of reality, VR relies strongly on the adequate selection of specific perceptual cues to activate emotions. Emotional experiences in turn are related to presence, another important concept in VR, which describes the user's sense of being in a virtual environment. This paper summarizes current research on the perception of fear cues, emotion, and presence, aiming at identifying the most relevant aspects of emotional experience in VR and their mutual relations. A special focus lies on a series of recent experiments designed to test the relative contributions of perception and conceptual information to fear in VR. This strand of research capitalizes on the dissociation between perception (bottom-up input) and conceptual information (top-down input) that is possible in VR. Further, we review the factors that have so far been recognized to influence presence, with emotions (e.g., fear) being the most relevant in the context of clinical psychology. Recent research has highlighted the mutual influence of presence and fear in VR, but has also traced the limits of our current understanding of this relationship. In this paper, the crucial role of perception in eliciting emotional reactions is highlighted, and the role of arousal as a basic dimension of emotional experience is discussed.
An interoceptive attribution model of presence is suggested as a first step toward an integrative framework for emotion research in VR. Gaps in the current literature and future directions are outlined.
Emotional facial expressions provide critical information for social interactions. Above all, angry faces are assumed to signal potential social threat. We investigated event-related potentials (ERPs) triggered by natural and artificial faces expressing fear, anger, happiness, or no emotion in participants with low and high levels of social anxiety. Overall, artificial faces elicited stronger P100 and N170 responses than natural faces. Additionally, the N170 component was larger for emotional compared to neutral facial expressions. Social anxiety was associated with an enhanced emotional modulation of the early posterior negativity (EPN) in response to fearful and angry facial expressions. Furthermore, while the late positive potential (LPP) was larger for emotional than for neutral faces in low socially anxious participants, LPPs of high socially anxious participants did not differ between face types, suggesting that LPPs might be enhanced in high socially anxious participants for both emotional and neutral faces. The modulations of the EPN and LPP were comparable between natural and artificial faces. These results indicate that social anxiety influences early perceptual processing of faces and that artificial faces are suitable for psychophysiological emotion research.
Pain is aversive, but does the cessation of pain ('relief') have a reward-like effect? Indeed, fruit flies avoid an odour previously presented before a painful event, but approach an odour previously presented after a painful event. Thus, event timing may turn punishment into reward. However, is event timing also crucial in humans, who can have explicit cognitions about associations? Here, we show that stimuli associated with pain relief acquire positive implicit valence but are explicitly rated as aversive. Specifically, the startle response, an evolutionarily conserved defence reflex, is attenuated by stimuli that had previously followed a painful event, indicating implicit positive valence of the conditioned stimulus; nevertheless, participants explicitly evaluate these stimuli as 'emotionally negative'. These results demonstrate a rift between the implicit and explicit conditioned valence induced by pain relief. They might explain why humans in some cases are attracted by conditioned stimuli despite explicitly judging them as negative.
The current experiment explored the influence of attitudes on facial reactions to emotional faces. The participants' attitudes (positive, neutral, and negative) towards three types of characters were manipulated by written reports. Afterwards, participants saw happy, neutral, and sad facial expressions of the respective characters while their facial muscular reactions (M. corrugator supercilii and M. zygomaticus major) were recorded electromyographically. Results revealed facial mimicry reactions to happy and sad faces of positive characters, but weaker and even incongruent facial muscular reactions to happy and sad faces of negative characters. Overall, the results show that attitudes, formed within a few minutes and solely through reports rather than first-hand experience, can moderate automatic non-verbal social behavior, i.e., facial mimicry.
Facial muscular reactions to avatars' static (neutral, happy, angry) and dynamic (morphs developing from neutral to happy or angry) facial expressions, presented for 1 s each, were investigated in 48 participants. Dynamic expressions led to better recognition rates and higher intensity and realism ratings. Angry expressions were rated as more intense than happy expressions. EMG recordings indicated emotion-specific reactions to happy avatars as reflected in increased M. zygomaticus major and decreased M. corrugator supercilii tension, with stronger reactions to dynamic as compared to static expressions. Although rated as more intense, angry expressions elicited no significant M. corrugator supercilii activation. We conclude that facial reactions to angry and to happy facial expressions hold different functions in social interactions. Further research should vary dynamics in different ways and also include additional emotional expressions.
Our first impression of others is highly influenced by their facial appearance. However, the perception and evaluation of faces is not only guided by internal features such as facial expressions, but is also highly dependent on contextual information, such as secondhand information (verbal descriptions) about the target person. To investigate the time course of contextual influences on cortical face processing, event-related brain potentials were recorded in response to neutral faces, which were preceded by brief verbal descriptions containing cues of affective valence (negative, neutral, positive) and self-reference (self-related vs. other-related). ERP analysis demonstrated that early and late stages of face processing are enhanced by negative, positive, and self-relevant descriptions, although the faces per se did not differ perceptually. Affective ratings of the faces confirmed these findings. Altogether, these results demonstrate for the first time, on both an electrocortical and a behavioral level, how contextual information modifies early visual perception in a top-down manner.
In interpersonal encounters, individuals often exhibit changes in their own facial expressions in response to the emotional expressions of another person. Such changes are often called facial mimicry. While facial mimicry first appeared to be an automatic tendency of the perceiver to show the same emotional expression as the sender, evidence is now accumulating that situation, person, and relationship jointly determine whether, and for which emotions, such congruent facial behavior is shown. We review the evidence regarding the moderating influence of such factors on facial mimicry, with a focus on understanding the meaning of facial responses to emotional expressions in a particular constellation. From this, we derive recommendations for a research agenda with a stronger focus on the most common forms of encounters, actual interactions with known others, and on assessing potential mediators of facial mimicry. We conclude that facial mimicry is modulated by many factors: attention deployment and sensitivity, detection of valence, emotional feelings, and social motivations. We posit that these are the more proximal causes of changes in facial mimicry due to changes in its social setting.