In our natural world, a face is usually encountered not as an isolated object but as an integrated part of a whole body. The face and the body both normally contribute to conveying the emotional state of the individual. Here we show that observers judging a facial expression are strongly influenced by emotional body language. Photographs of fearful and angry faces and bodies were used to create face-body compound images with either matched or mismatched emotional expressions. When face and body convey conflicting emotional information, judgment of facial expression is hampered and becomes biased toward the emotion expressed by the body. Electrical brain activity was recorded from the scalp while subjects attended to the face and judged its emotional expression. An enhancement of the occipital P1 component as early as 115 ms after presentation onset points to the existence of a rapid neural mechanism sensitive to the degree of agreement between simultaneously presented facial and bodily emotional expressions, even when the latter are unattended.

emotion communication | event-related potentials | visual perception

The face and the body both normally contribute to conveying the emotional state of the individual. Darwin (1) was the first to describe in detail the specific facial and bodily expressions associated with emotions in animals and humans, and he regarded these expressions as part of emotion-specific adaptive actions. From this vantage point, face and body are part of an integrated whole. Indeed, in our natural world, faces are usually encountered not as isolated objects but as integrated parts of whole bodies. Rapid detection of inconsistencies between them is beneficial when rapid adaptive action is required from the observer.
To date, there has been no systematic investigation into how facial expressions and emotional body language interact in human observers, and the underlying neural mechanisms are unknown. Here, we investigate the influence of emotional body language on the perception of facial expression. We collected behavioral data and simultaneously measured electrical event-related potentials (ERPs) from the scalp to explore the time course of neuronal processing with millisecond resolution. We predicted that recognition of facial expressions is influenced by concurrently presented emotional body language, and that affective information from the face and the body starts to interact rapidly. In view of the important adaptive function of perceiving emotional states, and of previous findings on the rapid recognition of emotional signals, we deemed it unlikely that integrated perception of face-body images results from relatively late and slow semantic processes. We hypothesize that the integration of affective information from a facial expression and the accompanying emotional body language is a mandatory automatic process occurring early in the processing stream, one that does not require selective attention, thorough visual analysis of individual features, or conscious deliberate evaluation.
Recent findings indicate that the perceptual processing of fearful expressions in the face can be initiated as early as 100-120 ms after stimulus presentation, demonstrating that the emotional information of a face can be encoded before the identity of the face is fully recognized. At present it is not clear whether fear signals from body expressions are encoded equally rapidly. To answer this question we investigated the early temporal dynamics of perceiving fearful body expressions by measuring EEG. Participants viewed images of whole-body actions presented in either a neutral or a fearful version. We observed an early emotion effect on the P1 peak latency around 112 ms after stimulus onset, an effect hitherto found only for facial expressions. Also consistent with the majority of facial expression studies, the N170 component elicited by perceiving bodies proved not to be sensitive to the expressed fear. In line with previous work, its vertex positive counterpart, the VPP, did show a condition-specific influence of fearful body expressions. Our results indicate that the information provided by fearful body expressions is already encoded in the early stages of visual processing, and suggest that similar early processing mechanisms are involved in the perception of fear in faces and bodies.
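The P1 latency reported above is typically quantified by locating the most positive deflection of the averaged ERP waveform within an early occipital time window. As a minimal sketch of that measurement (the window bounds, sampling rate, and synthetic waveform here are illustrative assumptions, not parameters from the studies):

```python
import numpy as np

def p1_peak_latency(erp, times, window=(0.08, 0.14)):
    """Return (latency_s, amplitude_V) of the most positive peak of an
    averaged ERP within the search window (default: a typical P1 range)."""
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.argmax(erp[mask])
    return times[mask][idx], erp[mask][idx]

# Synthetic averaged ERP sampled at 1 kHz, -100..400 ms around stimulus
# onset, with a Gaussian "P1" deflection peaking at 112 ms (assumed data).
times = np.arange(-0.1, 0.4, 0.001)
erp = 5e-6 * np.exp(-((times - 0.112) ** 2) / (2 * 0.01 ** 2))

latency, amplitude = p1_peak_latency(erp, times)
print(f"P1 peak: {round(latency * 1000)} ms, {amplitude * 1e6:.1f} uV")
```

In practice, toolboxes such as MNE-Python provide equivalent windowed peak-picking on real evoked data; the point of the sketch is only that "P1 peak latency around 112 ms" denotes the time of the windowed amplitude maximum, not the onset of the component.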