2012
DOI: 10.1371/journal.pone.0031001

Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

Abstract: Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-rela…

Cited by 49 publications (44 citation statements)
References 73 publications (112 reference statements)
“…The EEG analyzing window was between −200 and 600 ms, and the 200 ms pre-stimulus EEG served as a baseline. According to previous studies (Joyce and Rossion, 2005; Liu et al., 2012; Shannon et al., 2013), in order to measure the most robust effects, parietal-occipital sites were re-referenced to the average of all electrodes; these recording sites yielded a characteristic ERP response that included an early positive-going component that peaked around 150 ms (posterior P1), followed by a negative-going peak evident 50 ms later (N170). Frontal-central electrode sites were re-referenced to the mean of the right and left mastoids (anterior N1/VPP/N2/P3).…”
Section: Data Measurements and Analysis
confidence: 99%
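The preprocessing steps in the excerpt above (baseline correction against the 200 ms pre-stimulus window, average reference for parietal-occipital sites, linked-mastoid reference for frontal-central sites) can be sketched in NumPy. This is a minimal illustration, not the cited study's pipeline: the epoch array is simulated, and the channel count, sampling rate, and mastoid channel indices are all hypothetical.

```python
import numpy as np

# Hypothetical epoch: 64 channels x 800 samples at 1000 Hz,
# spanning -200..600 ms relative to stimulus onset (one sample per ms).
rng = np.random.default_rng(0)
epoch = rng.normal(size=(64, 800))
times = np.arange(-200, 600)  # ms

# 1) Baseline correction: subtract each channel's mean over the
#    200 ms pre-stimulus window.
baseline = epoch[:, times < 0].mean(axis=1, keepdims=True)
epoch = epoch - baseline

# 2) Average reference (parietal-occipital sites in the excerpt):
#    subtract the instantaneous mean across all electrodes.
avg_ref = epoch - epoch.mean(axis=0, keepdims=True)

# 3) Linked-mastoid reference (frontal-central sites in the excerpt):
#    subtract the mean of the two mastoid channels
#    (channel indices here are placeholders).
left_mastoid, right_mastoid = 60, 61
mastoid_ref = epoch - epoch[[left_mastoid, right_mastoid], :].mean(
    axis=0, keepdims=True
)
```

After the average reference, the mean across channels is zero at every time point, which is the defining property of that referencing scheme.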
“…These effects have been interpreted as evidence for an early influence of one modality on the other (de Gelder et al., 1999; Pourtois et al., 2000; Liu et al., 2012). Comparing unimodal and multimodal presentations of human communication, Stekelenburg and Vroomen (2007) observed an effect of multimodality on the N100 and the P200 component time-locked to the sound onset.…”
Section: Introduction
confidence: 99%
“…In the last decades, robust evidence has accumulated showing that we are especially adept at and tuned to decode emotional information from numerous types of signals (e.g., Kousta, Vinson, & Vigliocco, 2009; Liu et al., 2012; Pinheiro, del Re, Mezin, et al., 2013; Pinheiro et al., 2014). This is not surprising, considering the role played by the fast detection of emotional salience in approach versus avoidance behaviors, and its relevance for survival or attainment of goals.…”
confidence: 99%