2018
DOI: 10.1016/j.cortex.2018.03.031
Does dynamic information about the speaker's face contribute to semantic speech processing? ERP evidence

Cited by 10 publications (8 citation statements); References 46 publications
“…Two previous studies have investigated the impact of mouth movements within the N400 time window. Hernández-Gutiérrez and colleagues did not find any N400 difference between audiovisual and audio-only speech [54], while Brunellière and colleagues found an increase in N400 amplitude for more informative mouth movements [11]. Further research is necessary to clarify these discrepancies; however, our results suggest that mouth informativeness can affect processing in the N400 time window, but only in combination with other cues in a multimodal context.…”
Section: Discussion (contrasting)
confidence: 76%
“…However, Brunellière and colleagues compared the N400 of words beginning with more or less informative mouth movements (/b/ vs. /k/) and found that words with more informative mouth movements elicited a more negative N400 [11], suggesting increased processing difficulty. In contrast, Hernández-Gutiérrez and colleagues failed to find any N400 effect associated with mouth movements when comparing videos showing dynamic facial and mouth movements against a still image of the speaker [54].…”
Section: Introduction (mentioning)
confidence: 99%
“…As both facial speech processing and phoneme/grapheme associations are fundamentally audiovisual processes (Francisco et al., 2018), and have been shown to partly share neural circuitry (Blomert, 2011), we were particularly interested in examining the possibility that the presence of an articulating mouth may enhance the quality of word encoding by making phoneme/grapheme pairings clearer. Interestingly, the presence of articulation cues has been shown in prior experimental research to affect aspects of psycholinguistic processing, including facilitating upcoming word recognition (Hernández-Gutiérrez et al., 2018) and encoding during voice learning (Sheffert & Olson, 2004). For instance, in one study (Hernández-Gutiérrez et al., 2018), adults listened to short stories in which one target word was either expected from the story context or unexpected.…”
Section: Study (mentioning)
confidence: 99%
“…Two previous studies have investigated the impact of mouth movements within the N400 time window. Hernández-Gutiérrez and colleagues did not find any N400 difference between audiovisual and audio-only speech [52];…”
Section: Prosody, Gesture and Mouth Movements' Contribution to Linguis… (mentioning)
confidence: 92%
“…Two electrophysiological studies, however, reported conflicting findings. While Brunellière and colleagues linked more informative mouth movements to a more negative N400 amplitude [51], generally indicating increased processing difficulty, Hernández-Gutiérrez and colleagues failed to find any N400 effect associated with mouth movements [52].…”
Section: Introduction (mentioning)
confidence: 96%