2018
DOI: 10.3389/fnhum.2017.00653
Linear Representation of Emotions in Whole Persons by Combining Facial and Bodily Expressions in the Extrastriate Body Area

Abstract: Our human brain can rapidly and effortlessly perceive a person’s emotional state by integrating the isolated emotional faces and bodies into a whole. Behavioral studies have suggested that the human brain encodes whole persons in a holistic rather than part-based manner. Neuroimaging studies have also shown that body-selective areas prefer whole persons to the sum of their parts. The body-selective areas played a crucial role in representing the relationships between emotions expressed by different parts. Howe…

Cited by 10 publications (12 citation statements)
References 70 publications (113 reference statements)
“…Behavioral and neuroimaging studies have suggested that the human brain could encode whole-persons in a holistic rather than part-based manner ( McKone et al, 2001 ; Maurer et al, 2002 ; Zhang et al, 2012 ; Soria Bauser and Suchan, 2013 ), indicating that the whole-person’s emotional expression might not be integrated from the isolated emotional faces and bodies. Our latest study has found that in the EBA, the whole-person patterns were almost equally associated with weighted sums of face and body patterns, using different weights for happy expressions but equal weights for angry and fearful ones ( Yang et al, 2018 ), but this was not established for the other regions. Although some other regions like MPFC, left STS and precuneus have been demonstrated to be capable of coding the facial and bodily emotions at an abstract level regardless of the sensory cue ( Peelen et al, 2010 ; Klasen et al, 2011 ; Chikazoe et al, 2014 ; Skerry and Saxe, 2014 ; Aube et al, 2015 ; Kim et al, 2017 ; Schirmer and Adolphs, 2017 ), these regions that were not sensitive to whole-person stimuli might not be able to represent the emotions of whole-person stimuli ( Tsao et al, 2003 ; Pinsk et al, 2005 ; Heberlein and Atkinson, 2009 ).…”
Section: Discussion
confidence: 97%
“…Therefore, the emotions of whole-person expressions should be explored individually rather than in an integrated way from the isolated emotional faces and bodies. Further, one of our latest studies found that in the extrastriate body area (EBA), the whole-person patterns were almost equally associated with weighted sums of face and body patterns, using different weights for happy expressions but equal weights for angry and fearful ones ( Yang et al, 2018 ). Thus, it remains unclear how the whole-person’s emotion is represented in the human brain and whether the representations of emotions of the face, body, and whole-person expressions can be abstractly formed in specific brain regions.…”
Section: Introduction
confidence: 99%
“…In addition, a three-dimensional magnetization-prepared rapid-acquisition gradient echo (3D MPRAGE) sequence (TR = 1900 ms, TE = 2.52 ms, TI = 1100 ms, voxel size = 1 mm × 1 mm × 1 mm, matrix size = 256 × 256) was used to acquire the T1-weighted anatomical images. The stimuli were displayed by high-resolution stereo 3D glasses within a VisualStim Digital MRI Compatible fMRI system ( Choubey et al, 2009 ; Liang et al, 2017 ; Yang et al, 2018 ).…”
Section: Methods
confidence: 99%
“…At the beginning of each run, there was a 10 s fixation cross, which was followed by a 24 s stimulus block (the same condition and expression) and then a 4 s button task. Successive stimulus blocks were separated by the presentation of a fixation cross for 10 s. In each stimulus block, 12 expression stimuli were presented (each for 1520 ms) with an interstimulus interval (ISI) of 480 ms. During each stimulus block, participants were instructed to watch the facial stimuli carefully; after the block, a screen appeared with six emotion categories and corresponding button indexes, instructing participants to press a button to indicate the facial expression they had seen in the previous block ( Liang et al, 2017 ; Yang et al, 2018 ). Participants were provided with one response pad per hand, with three buttons each, in the fMRI experiment ( Ihme et al, 2014 ), and they were pre-trained to familiarize themselves with the button pads before scanning.…”
Section: Methods
confidence: 99%
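The block-design timing quoted above is internally consistent, which can be checked with a short sketch. The constants below are taken directly from the excerpt; the per-cycle total is our own arithmetic and is not stated in the cited paper.

```python
# Timing parameters as quoted in the Methods excerpt above.
FIXATION_S = 10.0   # fixation cross before and between blocks (s)
STIM_MS = 1520      # single stimulus presentation (ms)
ISI_MS = 480        # interstimulus interval (ms)
N_STIM = 12         # stimuli per block
TASK_S = 4.0        # button-response screen after each block (s)

# 12 x (1520 + 480) ms = 24,000 ms, matching the stated 24 s stimulus block.
block_s = N_STIM * (STIM_MS + ISI_MS) / 1000.0
print(block_s)  # 24.0

# One fixation + stimulus block + button task cycle (derived, not stated).
cycle_s = FIXATION_S + block_s + TASK_S
print(cycle_s)  # 38.0
```

This kind of sanity check (stimulus duration plus ISI times item count equals the reported block length) is a quick way to verify that a described fMRI block design has no timing gaps.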
“…Therefore, it is essential to explore the neural representation of whole-person expressions individually rather than in an integrated manner based on the isolated emotional faces and bodies (Zhang et al, 2012; Soria Bauser and Suchan, 2015). Moreover, most previous studies used static emotional images as stimuli, but, considering that the emotions we mostly encounter in a natural context are dynamic, recent studies have proposed that dynamic stimuli are more ecologically valid than their static counterparts (Johnston et al, 2013; Yang et al, 2018). Thus, using dynamic emotional stimuli may be more appropriate to investigate the authentic mechanisms used to recognize emotions in daily life.…”
Section: Introduction
confidence: 99%