2017
DOI: 10.1371/journal.pone.0177239
Mapping the emotional face. How individual face parts contribute to successful emotion recognition

Abstract: Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman wer…

Cited by 264 publications (277 citation statements) · References 40 publications (62 reference statements)
“…Furthermore, ventral stream areas encoded facial features prior to facial configuration when faces were presented for 150 ms. This adds to evidence suggesting that emotional face perception is supported by the processing of diagnostic features, such as the eyes and mouth (Wegrzyn, Vogt, Kireclioglu, Schneider, & Kissler, 2017). What is more, configural representations explain behaviour and overlap with behavioural representations, suggesting that it is face configuration that drives expression-selective responses in ventral stream areas and guides behaviour.…”
Section: Discussion (supporting) · Confidence: 62%
“…A potential mechanism might be deviant visual scanning behavior. For example, the role of attending to the eyes varies for the recognition of different emotions: while the mouth region contains critical information for identifying happiness, the recognition of fear and anger relies mainly on cues from the eye region (Smith, Cottrell, Gosselin, & Schyns, 2005; Wegrzyn, Vogt, Kireclioglu, Schneider, & Kissler, 2017). FER deficits for negative emotions might therefore originate in reduced attention to the eye region.…”
Section: Schulte (mentioning) · Confidence: 99%
“…In this paper, we employed GC together with a relevant interval selection approach on synthetic and nonverbal communication data obtained from an experimental setup. Based on the results of Wegrzyn et al. (2017) we designed our own emotional facial features, capable of capturing emotions even when strong distinct emotions are not visible. Our facial expressions are composed of facial action units, which can be detected by real-time, state-of-the-art computer vision tools.…”
Section: Discussion (mentioning) · Confidence: 99%
“…Wegrzyn et al. (2017) studied the relevance of facial areas for emotion classification and found differences in the importance of the eye and mouth regions. Facial AUs can be divided into upper and lower AUs (Cohn et al., 2007).…”
Section: Facial Expressive Feature Extraction (mentioning) · Confidence: 99%
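The upper/lower AU split mentioned in the excerpt above can be sketched as a simple partition over AU numbers. A minimal illustrative example in Python, assuming standard FACS AU numbering; the set names and helper function are hypothetical and the exact grouping used by the cited work may differ:

```python
# Illustrative sketch (not the cited implementation): partitioning FACS
# action units (AUs) into upper- and lower-face groups, following the
# conventional split into brow/eye-region vs. mouth/jaw-region AUs.

UPPER_FACE_AUS = {1, 2, 4, 5, 6, 7}                    # brows, lids, cheek raiser
LOWER_FACE_AUS = {9, 10, 12, 15, 17, 20, 23, 25, 26}   # nose, lips, jaw

def split_aus(active_aus):
    """Separate a set of detected AU numbers into upper- and lower-face subsets."""
    upper = sorted(au for au in active_aus if au in UPPER_FACE_AUS)
    lower = sorted(au for au in active_aus if au in LOWER_FACE_AUS)
    return upper, lower

# Example: AU6 + AU12 (+ AU25) is a prototypical happiness display,
# combining an eye-region and mouth-region action unit.
upper, lower = split_aus({6, 12, 25})
```

Such a partition lets region-specific features (eye vs. mouth) be computed separately, mirroring the finding that different emotions rely on different face regions.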