In daily life, our emotions are often elicited by a multimodal environment, mainly visual and auditory stimuli. It is therefore important to investigate whether emotional responses to pictures and sounds are symmetrical. In this study, we aimed to elucidate the relationship between attentional states toward emotional unimodal stimuli (pictures or sounds) and emotional responses by measuring pupil diameter, which reflects the emotional arousal associated with increased sympathetic activity. Our hypothesis was that the emotional responses to image and sound stimuli are symmetrical: emotion might be suppressed when attentional resources are allocated to another stimulus of the same modality as the emotional stimulus, such as a dot presented at the same time as an emotional image, or a beep presented at the same time as an emotional sound. Across our two experiments, pupillary-response data from 24 participants were analyzed. In experiment 1, we investigated the relationship between the attentional state toward emotional visual stimuli (International Affective Picture System) and emotional responses using pupillometry. We set four task conditions to modulate the attentional state (emotional task, no task, visual detection task, and auditory detection task). We observed that pupillary dilation was faster during the presentation of emotionally arousing pictures than during that of neutral ones, regardless of the valence of the pictures. Importantly, this effect did not depend on the task condition. In experiment 2, we investigated the relationship between the attentional state toward emotional auditory stimuli (International Affective Digitized Sounds) and emotional responses. We observed a trend towards a significant interaction between the stimulus and task conditions with regard to the velocity of pupillary dilation.
In the emotional and auditory detection tasks, pupillary dilation was faster for positive and neutral sounds than for negative sounds, whereas there were no significant differences in the no-task and visual detection task conditions. Taken together, the current data reveal that different pupillary responses were elicited by emotional visual and auditory stimuli, at least in
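The "velocity of pupillary dilation" measure used in both experiments can be approximated as the first derivative of a lightly smoothed pupil-diameter trace. The function below is an illustrative sketch, not the authors' actual analysis pipeline; the smoothing window and units are assumptions.

```python
import numpy as np

def dilation_velocity(pupil, fs, smooth_s=0.1):
    """Approximate pupil dilation velocity as the first derivative
    of a moving-average-smoothed pupil-diameter trace.

    pupil    : 1-D array of pupil diameters (e.g., mm)
    fs       : sampling rate in Hz
    smooth_s : moving-average window length in seconds (assumed value)
    """
    w = max(1, int(smooth_s * fs))
    kernel = np.ones(w) / w
    smoothed = np.convolve(pupil, kernel, mode="same")
    # central differences give velocity in units of diameter per second
    return np.gradient(smoothed, 1.0 / fs)
```

Faster dilation to arousing stimuli would then appear as a larger peak in this velocity trace during the stimulus window.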
There have been various studies on the effects of emotional visual processing on subsequent non-emotional auditory stimuli. A previous EEG study showed that deviant sounds presented after negative pictures recruited more attentional resources than those presented after neutral pictures. To investigate this competition between emotional and cognitive processing, the present study examined pupillary responses to an auditory stimulus after a positive, negative, or neutral emotional state had been elicited by an emotional image. Each emotional image was followed by a beep that was either repetitive or unexpected, and pupillary dilation was measured. We found that the early component of the pupillary response to the beep was larger for negative and positive emotional states than for the neutral state, whereas the late component was larger for the positive state than for the negative and neutral states. In addition, the peak latency of the pupillary response was earlier for negative images than for neutral or positive ones. Further, to compensate for the low temporal resolution of the pupillary data, the pupillary responses were deconvolved before analysis. The deconvolution analysis confirmed that the responses to the beep were modulated by the emotional state rather than merely being influenced by the short interval between the images and sounds. These findings suggest that pupil size indexes modulations arising from the competition between emotional and cognitive processing.
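The deconvolution step described above, which disentangles overlapping pupillary responses to closely spaced events, is commonly implemented as a least-squares finite-impulse-response (FIR) estimate. The sketch below illustrates that general approach under assumed parameters; the function name, window length, and sampling rate are illustrative, not the authors' implementation.

```python
import numpy as np

def deconvolve_pupil(signal, event_onsets, fs, window_s=4.0):
    """Estimate the event-locked pupil impulse response by
    least-squares FIR deconvolution. Overlapping responses from
    closely spaced events are separated by regression rather than
    simple averaging.

    signal       : 1-D pupil-diameter trace
    event_onsets : event times in seconds
    fs           : sampling rate in Hz
    window_s     : length of the estimated response window in seconds
    """
    n = len(signal)
    k = int(window_s * fs)                  # number of FIR lags
    X = np.zeros((n, k))
    for t in event_onsets:
        i = int(round(t * fs))
        for lag in range(k):
            if i + lag < n:
                X[i + lag, lag] = 1.0       # delta regressor at each lag
    # least-squares estimate of the shared impulse response
    irf, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return irf
```

Because the regression models each sample as the sum of all active lagged responses, the recovered impulse response is not contaminated by the preceding image-evoked response even when the image-sound interval is short.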
Changes in facial color and expression reflect our mental or physical condition. Previous behavioral studies have indicated a strong interaction between the perception of facial color and expression. This study investigated the contribution of facial color to expression recognition in blurred images, measured through behavior and pupillary change. In the experiment, face stimuli in two facial colors (natural, reddish) with different expressions (neutral, anger) were presented at three blur levels. Participants performed an expression-identification task for each stimulus. Behavioral results indicated that facial color makes a significant contribution to expression recognition as the blur level increases. The pupillometry results further showed that the reddish color provided the information necessary to identify anger. Together, these results show that the contribution of facial color increases with blur level in both the psychophysical and pupillometric measurements, suggesting that facial color emphasizes the characteristics of specific facial expressions.