Virtual reality (VR) has made its way into mainstream psychological research over the last two decades. This technology, with its unique ability to simulate complex, realistic situations and contexts, offers researchers unprecedented opportunities to investigate human behavior in well-controlled laboratory designs. One important application of VR is the investigation of pathological processes in mental disorders, especially anxiety disorders. Research on the processes underlying threat perception, fear, and exposure therapy has shed light on more general aspects of the relation between perception and emotion. Being by its nature virtual, i.e., a simulation of reality, VR relies strongly on the adequate selection of specific perceptual cues to activate emotions. Emotional experiences in turn are related to presence, another important concept in VR, which describes the user's sense of being in a VR environment. This paper summarizes current research on the perception of fear cues, emotion, and presence, aiming to identify the most relevant aspects of emotional experience in VR and their mutual relations. A special focus lies on a series of recent experiments designed to test the relative contributions of perceptual and conceptual information to fear in VR. This strand of research capitalizes on the dissociation between perception (bottom-up input) and conceptual information (top-down input) that is possible in VR. Further, we review the factors that have so far been recognized to influence presence, with emotions (e.g., fear) being the most relevant in the context of clinical psychology. Recent research has highlighted the mutual influence of presence and fear in VR, but has also traced the limits of our current understanding of this relationship. In this paper, the crucial role of perception in eliciting emotional reactions is highlighted, and the role of arousal as a basic dimension of emotional experience is discussed.
An interoceptive attribution model of presence is suggested as a first step toward an integrative framework for emotion research in VR. Gaps in the current literature and future directions are outlined.
There is evidence that specific regions of the face, such as the eyes, are particularly relevant for the decoding of emotional expressions, but it has not been examined whether observers' scan paths vary for facial expressions with different emotional content. In this study, eye-tracking was used to monitor the scanning behavior of healthy participants while they looked at different facial expressions. Locations of fixations and their durations were recorded, and a dominance ratio (i.e., eyes and mouth relative to the rest of the face) was calculated. Across all emotional expressions, initial fixations were most frequently directed to either the eyes or the mouth. In sad facial expressions especially, participants directed their initial fixation to the eyes more frequently than in all other expressions. In happy facial expressions, participants fixated the mouth region for a longer time across all trials. For fearful and neutral facial expressions, the dominance ratio indicated that the eyes and mouth were equally important. However, in sad and angry facial expressions, the eyes received more attention than the mouth. These results confirm the relevance of the eyes and mouth in emotional decoding, but they also demonstrate that facial expressions with different emotional content are not all decoded equally. Our data suggest that people look at the regions that are most characteristic of each emotion.
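The dominance ratio described above can be sketched in a few lines. The sketch below is a hypothetical illustration, not the study's actual analysis code: the region labels (`"eyes"`, `"mouth"`, etc.) and the sample fixation data are assumptions made for the example, and the ratio is computed as total dwell time on the eyes and mouth divided by dwell time on all other facial regions.

```python
def dominance_ratio(fixations):
    """Compute a fixation dominance ratio for one trial.

    fixations: list of (region, duration_ms) tuples, where region is a
    label such as "eyes", "mouth", "nose", or "cheek".
    Returns total dwell time on eyes + mouth divided by total dwell time
    on all other facial regions.
    """
    feature_time = sum(d for r, d in fixations if r in ("eyes", "mouth"))
    other_time = sum(d for r, d in fixations if r not in ("eyes", "mouth"))
    if other_time == 0:
        # All fixations landed on the eyes or mouth.
        return float("inf")
    return feature_time / other_time

# Example trial: fixation durations (ms) on four illustrative regions.
trial = [("eyes", 420), ("mouth", 310), ("nose", 150), ("cheek", 95)]
print(round(dominance_ratio(trial), 2))  # (420 + 310) / (150 + 95) ≈ 2.98
```

A ratio above 1 indicates that the eyes and mouth together attracted more dwell time than the rest of the face; comparing this value across expression categories is what lets the study conclude, for example, that the eyes dominate in sad and angry faces.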
A versatile, ultralight, N-doped, 3D graphene framework (GF) is prepared. In their Communication, L. Qu and co-workers show that this GF has an ultralow density ((2.1±0.3) mg cm⁻³; a GF block can balance on a dandelion) and that its adsorption capacity for oils is much higher than that of the best carbonaceous sorbents. The 3D open-pore structure and N doping make GF promising as an electrode material for supercapacitors and as a metal-free catalyst for the oxygen reduction reaction in fuel cells.
Therapist-guided exposure is more effective for agoraphobic avoidance, overall functioning, and panic attacks in the follow-up period than is CBT without therapist-guided exposure. Therapist-guided exposure promotes additional therapeutic improvement, possibly mediated by increased physical engagement in feared situations, beyond the effects of a CBT treatment in which exposure is simply instructed.
Two incompatible pictures compete for perceptual dominance when each is presented to one eye. This so-called binocular rivalry results in an alternation of dominant and suppressed percepts. In accordance with current theories of emotion processing, the authors' previous research has suggested that emotionally arousing pictures predominate in this perceptual process. Three experiments were run with pictures of emotional facial expressions, which are known to induce emotions while being well controlled in terms of physical characteristics. In Experiment 1, photographs of emotional and neutral facial expressions of the same actor were presented to minimize physical differences. In Experiment 2, schematic emotional expressions were presented to further eliminate low-level differences. In Experiment 3, a probe-detection task was conducted to control for possible response biases. Together, these data clearly demonstrate that emotional facial expressions predominate over neutral expressions; they are more often the first percept and are perceived for longer durations. This is not caused by physical stimulus properties or by response biases. This novel approach supports the view that emotionally significant visual stimuli are preferentially perceived.
Animal studies have suggested that neuropeptide S (NPS) and its receptor (NPSR) are involved in the pathogenesis of anxiety-related behavior. In this study, a multilevel approach was applied to further elucidate the role of NPS in the etiology of human anxiety. The functional NPSR A/T (Asn107Ile) variant (rs324981) was investigated for association with (1) panic disorder with and without agoraphobia in two large, independent case-control studies, (2) dimensional anxiety traits, (3) autonomic arousal level during a behavioral avoidance test, and (4) brain activation correlates of anxiety-related emotional processing in panic disorder. The more active NPSR rs324981 T allele was found to be associated with panic disorder in the female subgroup of patients in both samples as well as in a meta-analytic approach. The T risk allele was further related to elevated anxiety sensitivity, increased heart rate, and higher symptom reports during a behavioral avoidance test, as well as decreased activity in the dorsolateral prefrontal, lateral orbitofrontal, and anterior cingulate cortex during processing of fearful faces in patients with panic disorder. The present results provide converging evidence for a female-dominant role of NPSR gene variation in panic disorder, potentially through heightened autonomic arousal and distorted processing of anxiety-relevant emotional stimuli.
Visual emotional stimuli evoke enhanced activation in early visual cortex areas, which may help organisms quickly detect biologically salient cues and initiate appropriate approach or avoidance behavior. Functional neuroimaging evidence for the modulation of other sensory modalities by emotion is scarce. Therefore, the aim of the present study was to test whether sensory facilitation by emotional cues can also be found in the auditory domain. We recorded auditory brain activation with functional near-infrared spectroscopy (fNIRS), a non-invasive and silent neuroimaging technique, while participants listened to standardized pleasant, unpleasant, and neutral sounds selected from the International Affective Digitized Sounds (IADS) system. Pleasant and unpleasant sounds led to increased auditory cortex activation compared with neutral sounds. This is the first study to suggest that the enhanced activation of sensory areas in response to complex emotional stimuli is not restricted to the visual domain but is also evident in the auditory domain.