Most studies investigating speeded orientation towards threat have used manual responses. Measuring orienting behaviour with eye movements provides a more direct and ecologically valid measure of attention. Here, we used a forced-choice saccadic and manual localization task to investigate the speed of discrimination for fearful and neutral body and face images. Fearful/neutral body or face pairs were presented bilaterally for either 20 or 500 ms. Results showed faster saccadic orienting to fearful body and face stimuli compared with neutral stimuli only at the shortest presentation time (20 ms). For manual responses, faster discrimination of fearful bodies and faces was observed only at the longest duration (500 ms). More errors were made when localizing neutral targets, suggesting that fearful bodies and faces may have captured attention automatically. Results were not attributable to low-level image properties, as no threat bias, in terms of reaction time or accuracy, was observed for inverted presentation. Taken together, the results suggest faster localization of threat conveyed by both the face and the body within the oculomotor system. In addition, enhanced detection of fearful body postures suggests that we can readily recognize threat-related information conveyed by body postures in the absence of any face cues.
According to theories of emotion and attention, we are predisposed to orient rapidly toward threat. However, previous examinations of attentional cueing by threat showed no enhanced capture at brief durations, a finding that may be related to the sensitivity of the manual response measure used. Here we investigated the time course of orienting attention toward fearful faces in the exogenous cueing task. Cue duration (20 ms or 100 ms) and response mode (saccadic or manual) were manipulated. In the saccade mode, both enhanced attentional capture and impaired disengagement from fearful faces were evident and limited to 20 ms, suggesting that saccadic cueing effects emerge rapidly and are short lived. In the manual mode, fearful faces impacted only upon the disengagement component of attention at 100 ms, suggesting that manual cueing effects emerge over longer periods of time. Importantly, saccades could reveal threat biases at brief cue durations, consistent with current theories of emotion and attention.
Three experiments investigated whether emotional information influences perceptual dominance during binocular rivalry. In Experiment 1, rival emotional and neutral faces in the background were coupled with grating stimuli in the foreground. Results showed that gratings paired with emotional faces dominated over those paired with neutral faces. In Experiment 2, emotional and neutral faces were presented dichoptically, without being paired with other stimuli. Dominance of emotional faces was observed. Fusion and low-level image differences were ruled out by examining dominance periods of upright and inverted emotional and neutral faces presented as face-house pairs (Experiment 3). Here, face stimuli dominated over house stimuli only for upright face conditions. In addition, upright emotional faces were perceived for significantly longer durations than upright neutral faces. The results provide further support for the influence of emotional meaning on binocular rivalry.
Models of attention and emotion assign a special status to the processing of threat. While evidence for threat-related attentional bias in highly anxious individuals is robust, effects in the normal population are mixed. An important explanation for the absence of threat-related attentional bias in nonanxious individuals may relate to the spatial frequency components of stimuli. Here we report behavioral data from two experiments examining the relationship between spatial frequency components of emotional and neutral faces and fast saccadic orienting behavior. In Experiment 1, participants had to saccade toward a single face, filtered to include mostly low, high or broad spatial frequencies (LSF, HSF or BSF), posing a fearful, happy or neutral expression, presented for 20 ms in the periphery. At BSF, a general emotional effect was found whereby saccadic responses were faster for fearful and happy faces relative to neutral, with no significant differences between fearful and happy faces. At LSF, both fearful and happy faces had shorter saccadic latencies than neutral faces, demonstrating an emotional bias consistent with the BSF data. However, at LSF fearful faces resulted in significantly faster saccades than happy faces, indicating that this bias was stronger for threat-related faces. There was no difference in saccadic responses between any emotions at HSF. Experiment 2 showed that the emotional bias diminished for inverted stimuli, suggesting that the results were not attributable to low-level image properties. The findings suggest an overall advantage in the oculomotor system for orienting to emotional stimuli and, at LSF in particular, significantly faster localization of threat conveyed by face stimuli in all individuals.
According to theories of attention and emotion, threat-related stimuli (e.g., negative facial expressions) capture and hold attention. Despite these theories, previous examinations of attentional cueing by threat showed no enhanced capture at brief durations. One explanation for the absence of attentional capture effects may be related to the sensitivity of the manual response measure employed. Here we extended beyond facial expressions and investigated the time course of orienting attention towards fearful body postures in the exogenous cueing task. Cue duration (20, 40, 60, or 100 ms), orientation (upright or inverted), and response mode (saccadic eye movement or manual keypress) were manipulated across three experiments. In the saccade mode, both enhanced attentional capture and impaired disengagement from fearful bodies were evident and limited to rapid cue durations (20 and 40 ms), suggesting that saccadic cueing effects emerge rapidly and are short lived. In the manual mode, fearful bodies impacted only upon the disengagement component of attention at 100 ms, suggesting that manual cueing effects emerge over longer periods of time. No cueing modulation was found for inverted presentation, suggesting that valence, not low-level image confounds, was responsible for the cueing effects. Importantly, saccades could reveal threat biases at brief cue durations, consistent with current theories of emotion and attention.
Previous binocular rivalry studies with younger adults have shown that emotional stimuli dominate perception over neutral stimuli. Here we investigated the effects of age on patterns of emotional dominance during binocular rivalry. Participants performed a face/house rivalry task where the emotion of the face (happy, angry, neutral) and the orientation (upright, inverted) of the face and house stimuli were varied systematically. Age differences were found, with younger adults showing a general emotionality effect (happy and angry faces were more dominant than neutral faces) and older adults showing inhibition of anger (neutral faces were more dominant than angry faces) and positivity effects (happy faces were more dominant than both angry and neutral faces). Age differences in dominance patterns were reflected by slower rivalry rates for both happy and angry compared to neutral face/house pairs in younger adults, and slower rivalry rates for happy compared to both angry and neutral face/house pairs in older adults. Importantly, these patterns of emotional dominance and slower rivalry rates for emotional-face/house pairs disappeared when the stimuli were inverted. This suggests that emotional valence, and not low-level image features, was responsible for the emotional bias in both age groups. Given that voluntary control plays a limited role in binocular rivalry, the findings imply that anger suppression and positivity effects in older adults may extend to more automatic tasks.
There is evidence that emotional stimuli capture spatial attention and that visual memory is enhanced for emotional content. Here we examine the relationship between the emotional content of stimuli and spatial memory. To assess spatial memory, a modified version of the Corsi Blocks Task (CBT), utilising emotional stimuli, was employed. In the CBT, a series of spatial positions is highlighted and the participant has to repeat these in the order in which they were presented. Results showed that presenting more meaningful stimuli, such as emotional faces (e.g. angry or happy), at the spatial locations in the CBT did not enhance spatial memory span relative to the presentation of neutral stimuli (e.g. neutral faces) or non-image stimuli signified by a change in the luminance of the blocks. In addition, saccadic eye movements performed during retention disrupted spatial memory for all items. This occurred irrespective of whether the item to be remembered was a face or a luminance-defined stimulus, and of whether the face carried emotional significance. The results were not related to the visibility of the test stimuli, as participants recognised the emotion displayed by the faces significantly above chance and rated emotional faces as more arousing than neutral faces. Changes in the type of emotional stimulus (e.g. fearful faces, emotional schematic faces, spiders or flowers) or in encoding duration (short vs. long) did not alter the pattern of results. These findings demonstrate an important dissociation between spatial capture and memory. Although emotional content can modulate orienting behaviour, it appears to have limited effect on spatial memory.