2007
DOI: 10.1186/1744-9081-3-7
The face-specific N170 component is modulated by emotional facial expression

Abstract: Background: According to the traditional two-stage model of face processing, the face-specific N170 event-related potential (ERP) is linked to structural encoding of face stimuli, whereas later ERP components are thought to reflect processing of facial affect. This view has recently been challenged by reports of N170 modulations by emotional facial expression. This study examines the time-course and topography of the influence of emotional expression on the N170 response to faces.

Cited by 332 publications (179 citation statements)

References 46 publications
“…First, a comparison of semantic and affective categorization speeds for the same stimuli shows that affective discrimination is in fact slower than semantic categorization (Nummenmaa, Hyönä, & Calvo, 2010). Second, the earliest event-related potentials (ERPs) that are consistently modulated by affective stimulus content occur around 170 ms for emotional facial expressions (e.g., Ashley, Vuilleumier, & Swick, 2004; Batty & Taylor, 2003; Blau et al., 2007; Campanella et al., 2002; Eger et al., 2003; Krombholz, Schaefer, & Boucsein, 2007), and around 200–300 ms for arousal in complex emotional scenes (see review in Olofsson, Nordin, Sequeira, & Polich, 2008).…”
Section: Can Affective Processing Precede Semantic Recognition?
confidence: 98%
“…The criteria for identifying each ERP peak and latency were established based on the mean global field potential (MGFP) of all participants (Lee et al., 2010; Jung et al., 2012) and based on previous similar studies (Streit et al., 1999; Onitsuka et al., 2006; Blau et al., 2007; Turetsky et al., 2007; Wynn et al., 2008b): the P100 component had the maximum positive potential from 50 to 150 ms after stimulus onset at electrodes PO7 and PO8; N170 had the largest negative peak in ERP amplitude from 120 to 220 ms at P7/PO7 and P8/PO8; N250 had the largest negative potential in F1/FC1/FC3 and F2/FC2/FC4 at a latency of 150 to 350 ms; and the P300 component had the largest positive peak at electrodes F1/FC1 and F2/FC2 from 300 to 450 ms post stimulus.…”
Section: EEG Recording and ERP Analysis
confidence: 99%
“…Of particular interest for the present study are the N170 component and the later P2 component. The N170 ERP component is reliably observed following face stimuli (Bentin et al., 1996) and has been seen to be modulated by the emotional content of the face (Batty & Taylor, 2003; Blau et al., 2007), such that a greater N170 is observed for fearful faces. Other studies have shown that later components are modulated by both emotion (Eimer & Holmes, 2002) and facial recognition (Gosling & Eimer, 2011).…”
Section: Introduction
confidence: 93%