2000
DOI: 10.1080/026432900380472

Structural Encoding and Identification in Face Processing: ERP Evidence for Separate Mechanisms

Abstract: The present study had two aims. The first was to explore possible top-down effects of face-recognition and/or face-identification processes on the formation of the structural representation of faces, as indexed by the N170 ERP component. The second was to examine possible ERP manifestations of face-identification processes as an initial step toward assessing their time course and functional neuroanatomy. Identical N170 potentials were elicited by famous and unfamiliar faces in Experiment 1, when both were i…


Cited by 564 publications (528 citation statements)
References 42 publications
“…Reports with adults have suggested the N170 is: (1) affected by repetition (Caharel et al., 2002; Itier & Taylor, 2004; Jemel et al., 2003); (2) not affected by stimulus repetition (e.g., Begleiter, Porjesz, & Wang, 1995; Schweinberger, Pfütze, & Sommer, 1995; Schweinberger, Pickering, Jentzsch, Burton, & Kaufman, 2002); and (3) not affected by familiarity (e.g., Bentin & Deouell, 2000; Eimer, 2000; Rossion et al., 1999). However, a later component, the N250, has been shown to be modulated by familiarity and repetition (Schweinberger et al., 2002).…”
Section: Discussion (mentioning)
confidence: 99%
“…For instance, N200s are larger to pictures of one's own face than to others' faces (Tanaka, Curran, Porterfield, & Collins, 2005), and to famous as compared with unfamiliar faces (Bentin & Deouell, 2000). In the case of expression and race, although automatic vigilance mechanisms make it adaptive to initially devote greater attentional resources to angry faces and faces of racial outgroup members (as reflected in the N100 and P200), individuals with positive or neutral expressions, and those who are racial ingroup members, are probably more desirable candidates for greater individuation.…”
Section: ERP Components (mentioning)
confidence: 99%
“…For instance, electrophysiological studies have pinpointed a negative event-related potential (ERP) component peaking between 150 and 190 ms, which is highly sensitive to faces (the N170) and is distributed maximally over occipito-temporal cortex. Much of this research implicates the N170 in initial structural face encoding [3,4], but finds that it is insensitive to facial identity and familiarity [5], a dissociation predicted by Bruce and Young's model [2]. Surprisingly little is known, however, about the neural encoding of social category information.…”
Section: Introduction (mentioning)
confidence: 99%
“…Given an a priori hypothesis of N170 involvement in social category encoding and its localization to occipito-temporal sites [3][4][5], we measured mean amplitudes within a post-stimulus window of 150-190 ms, averaged together over left occipito-temporal sites (CP5/P3) and over right occipito-temporal sites (CP6/P4). The N170 in the left hemisphere did not reliably differ for sex-typical (mean = 0.97 μV) versus sex-atypical (mean = 1.09 μV) faces [t(22) = 0.29, P = 0.78].…”
Section: Event-Related Potential Data (mentioning)
confidence: 99%
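The analysis quoted above (mean amplitude in a fixed post-stimulus window, compared across conditions with a paired t-test) can be sketched as follows. This is a minimal illustration: the simulated data, sampling rate, and variable names are assumptions for the sketch, not the authors' actual recordings or electrode montage.

```python
import numpy as np
from scipy import stats

# Hypothetical simulated data: subjects x time samples, 500 Hz sampling,
# epochs spanning -100 to 500 ms relative to stimulus onset.
rng = np.random.default_rng(0)
sfreq = 500.0
times = np.arange(-0.1, 0.5, 1.0 / sfreq)  # seconds

n_subjects = 23  # consistent with the reported df of 22 in t(22)
typical = rng.normal(1.0, 0.5, size=(n_subjects, times.size))
atypical = rng.normal(1.0, 0.5, size=(n_subjects, times.size))

# Mean amplitude per subject in the 150-190 ms window (the N170 range).
win = (times >= 0.150) & (times <= 0.190)
mean_typ = typical[:, win].mean(axis=1)
mean_atyp = atypical[:, win].mean(axis=1)

# Paired t-test across subjects, as in the quoted comparison.
t, p = stats.ttest_rel(mean_typ, mean_atyp)
print(f"t({n_subjects - 1}) = {t:.2f}, p = {p:.2f}")
```

In a real pipeline the per-condition arrays would themselves be averages over the chosen occipito-temporal electrodes (e.g., CP5/P3 together, CP6/P4 together) before the window mean is taken.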