Facial expressions of emotion are inherently dynamic, but most studies of the visual strategies underlying the recognition of facial emotions have used static stimuli. The present study directly compared the visual strategies underlying the recognition of static and dynamic facial expressions using eye tracking and the Bubbles technique. The results revealed different eye-fixation patterns for the two kinds of stimuli, with fewer fixations on the eye and mouth areas during the recognition of dynamic than of static expressions. However, these differences in eye fixations were not accompanied by any systematic differences in the facial information that was actually processed to recognize the expressions.
Background: The face as a visual stimulus is a reliable source of information for judging the pain experienced by others. Until now, most studies investigating the facial expression of pain have used a descriptive method (i.e., the Facial Action Coding System). However, the facial features that are relevant to the observer in identifying the expression of pain remain largely unknown, despite the strong medical impact that misjudging pain can have on patients' well-being. Methods: Here, we investigated this question by applying the Bubbles method. Fifty healthy volunteers were asked to categorize facial expressions (the six basic emotions, pain and neutrality) displayed in stimuli obtained from a previously validated set and presented for 500 ms each. To determine the critical areas of the face used in this categorization task, the faces were partly masked based on random sampling of regions of the stimuli at different spatial frequency ranges. Results: Results show that accurate pain discrimination relies mostly on the frown lines and the mouth. An ideal-observer analysis further indicated that human observers' use of the frown lines could not be attributed to the objective 'informativeness' of this area. Conclusions: Based on a recent study suggesting that this area codes for the affective dimension of pain, we propose that the visual system has evolved to focus primarily on the facial cues that signal the aversiveness of pain, consistent with the social role of facial expressions in the communication of potential threats.
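The masking procedure described above can be illustrated with a minimal sketch. This is not the authors' implementation: the Bubbles method samples the stimulus through randomly placed Gaussian apertures, and the full method does so separately within several spatial-frequency bands; the sketch below covers only the spatial sampling at a single scale, and the parameter names (n_bubbles, sigma) and their values are illustrative assumptions.

```python
import numpy as np

def bubbles_mask(shape, n_bubbles, sigma, rng=None):
    """Smooth revealing mask: Gaussian 'bubbles' at random locations.

    n_bubbles and sigma (aperture width in pixels) are hypothetical
    parameters for illustration, not values from the study.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    # Overlapping bubbles can sum above 1; cap the mask at full visibility.
    return np.clip(mask, 0.0, 1.0)

def apply_bubbles(face, mask, background=0.5):
    """Reveal only the sampled regions of a grayscale face (values in [0, 1]);
    unsampled regions fade to a uniform mid-gray background."""
    return face * mask + background * (1.0 - mask)
```

Across many trials, regressing categorization accuracy onto the random mask locations identifies which facial regions drive correct responses, which is the logic behind the frown-line and mouth results reported above.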
The objective of this study was to examine the influence of variations in contextual features of a physically demanding lifting task on judgments of others' pain. Healthy undergraduates (n=98) were asked to estimate the pain experience of chronic pain patients who were filmed while lifting canisters at different distances from their body. Of interest was whether contextual information (i.e., lifting posture) contributed to pain estimates beyond the variance accounted for by pain behavior. Results indicated that judgments of others' pain varied significantly as a function of the contextual features of the pain-eliciting task; observers estimated significantly more pain when watching patients lift canisters positioned further away from the body than canisters closest to the body. Canister position contributed significant unique variance to the prediction of pain estimates even after controlling for observers' use of pain behavior as a basis for pain estimates. Correlational analyses revealed that greater use of the contextual features when judging others' pain was related to a lower discrepancy (higher accuracy) between estimated and self-reported pain ratings. Results also indicated that observers' level of catastrophizing was associated with more accurate pain estimates. A regression analysis further showed that observers' level of catastrophizing contributed to the prediction of the accuracy of pain estimates over and above the variance accounted for by the utilisation of contextual features. The discussion addresses the processes that might underlie the utilisation of contextual features of a pain-eliciting task when estimating others' pain.