Studies indicate that people with social anxiety show changes in perception of facial emotion. Here we investigated the recognition of static and dynamic facial expressions in 2 groups varying with regard to scores on the Social Phobia Inventory (SPIN) and classified as having high social anxiety (HSA; SPIN ≥ 19; n = 22) and low social anxiety (SPIN < 19; n = 21). Facial expressions of happiness, sadness, fear, and anger in dynamic (videos) and static (photos) conditions were presented at 4 intensities (25%, 50%, 75%, and 100%). For each condition, recognition means were analyzed with an ANOVA of model: 2 groups × (2 conditions [static and dynamic] × 4 emotions × 4 intensities). We found an interaction between the factors Group, Condition, Emotion, and Intensity. Post hoc analysis indicated that the HSA group had better scores in the static face of anger with 25% of emotion compared with controls. No difference between groups was found in the dynamic condition. The analysis of the confusion matrix of judgments indicated that the advantage of the participants with social anxiety in the static condition was not explained by a general bias of attributing anger to facial expressions. The results suggest an advantage for individuals with social anxiety to recognize emotions in stimuli with less ecological validity (static faces). The use of dynamic faces may reduce or eliminate the differences between individuals with high and low social anxiety in the recognition of facial emotions.
Social anxiety disorder (SAD) is characterized by the fear of being judged negatively in social situations. Eye-tracking techniques have been prominent among the methods used in recent decades to investigate emotional processing in SAD. This study offers a systematic review of studies on eye-tracking patterns in individuals with SAD and controls in facial emotion recognition tasks. Thirteen articles were selected from the consulted databases. Subjects with SAD exhibited a hypervigilance-avoidance pattern in response to emotions, primarily negative expressions: they avoided conspicuous areas of the face, particularly the eyes, when viewing negative expressions. However, this hypervigilance did not occur when the stimuli were presented in virtual reality. An important limitation of these studies is that they use only static expressions, which can reduce the ecological validity of the results.
Facial expressions are especially relevant for deaf people because, in addition to the emotional content, they assume linguistic functions exclusive to sign languages. Objective: We aimed to verify whether the better recognition of facial expressions in deaf people is also maintained for stimuli with greater ecological validity. Method: In the present study, we investigated the recognition of static (photographs) and dynamic (videos) facial expressions in: (a) deaf signers with profound congenital or early acquired deafness (up to 2 years of age); (b) hearing individuals who knew sign language; and (c) hearing individuals who did not know sign language. Facial expressions of joy, sadness, fear, and anger in static and dynamic conditions were presented at 4 intensities (25%, 50%, 75%, and 100%). Results: In the dynamic condition, we found that deaf individuals had lower recognition scores compared with the other 2 groups of hearing individuals (p < .05). However, in the static condition, no differences between groups were found (p > .05). Conclusion: Those results indicate that the use of facial expressions in sign language does not necessarily favor emotion recognition, probably because facial expressions have linguistic properties not related to emotional content.