2020
DOI: 10.1093/scan/nsaa110
Statistical pattern recognition reveals shared neural signatures for displaying and recognizing specific facial expressions

Abstract: Human neuroimaging and behavioural studies suggest that somatomotor ‘mirroring’ of seen facial expressions may support their recognition. Here we show that viewing specific facial expressions triggers the representation corresponding to that expression in the observer’s brain. Twelve healthy female volunteers underwent two separate fMRI sessions: one where they observed and another where they displayed three types of facial expressions (joy, anger and disgust). Pattern classifier based on Bayesian logistic reg…

Cited by 20 publications (15 citation statements); References 50 publications
“…The accurate and fast recognition of emotions from others' facial expressions is important for effective social interaction and communication (Blair, 2003). It has been proposed that emotions expressed through the face are recognised through a process of embodied simulation (Gallese, 2005), whereby others' expressions activate corresponding sensorimotor representations in the observer's brain, and this is supported by recent neuroimaging evidence (Volynets et al., 2020). The embodied response may involve mimicry, the subthreshold activation of facial muscles involved in producing the target expression, as demonstrated in electromyography (EMG) studies (Sato et al., 2008).…”
Section: Emotion Recognition As An Embodied Process
confidence: 97%
“…Portrayed emotions have been extracted mainly with manual annotations (Ekman and Oster, 1979; Lahnakoski et al., 2012; Witkower and Tracy, 2019). Emotion cues used in labeling portrayed emotions include, for instance, facial behaviors (e.g., frowning; Volynets et al., 2020), bodily behaviors (e.g., Witkower and Tracy, 2019), verbal and vocal cues (e.g., trembling voice; Ethofer et al., 2009), actions, and contextual cues (Skerry and Saxe, 2014). Perception of emotions operates at an abstract level regardless of the stimulus modality: emotions inferred from contextual descriptions and emotions perceived from facial expressions share neural codes (Skerry and Saxe, 2014).…”
Section: Portrayed Emotions
confidence: 99%
“…However, it is possible that the brain activity patterns measured with affective stimuli also reflect automatic emotional behaviors and their inhibition, as suggested by the automatic facial behaviors evoked during emotional movie viewing (Chang et al., 2021). For instance, displaying and observing facial behaviors activate partly overlapping brain regions (Volynets et al., 2020). Similarly, observing actions in movies and in real life automatically activates overlapping motor regions (Nummenmaa et al., 2014c; Smirnov et al., 2017).…”
Section: Emotional Behaviors
confidence: 99%
“…Previous classification studies have addressed the importance of these regions for distinct social perceptual tasks. For example, faces, animals and objects have been successfully classified from the corresponding brain activation patterns in areas within FG and LOTC [20], and the highest classification accuracy for facial expressions is typically observed in FG, STS and anterior temporal cortex [22,62,63]. Additionally, goal-oriented actions can be reliably classified in LOTC and the inferior parietal lobe [25,64].…”
Section: Decoding Of Perceptual Dimensions From Brain Activation Patterns
confidence: 99%