2013
DOI: 10.1007/s10548-013-0338-2
Brain Prediction of Auditory Emphasis by Facial Expressions During Audiovisual Continuous Speech

Abstract: The visual cues involved in auditory speech processing are not restricted to information from lip movements but also include head or chin gestures and facial expressions such as eyebrow movements. The fact that visual gestures precede the auditory signal implies that visual information may influence auditory activity. As visual stimuli are very close in time to the auditory information for audiovisual syllables, the cortical response to them usually overlaps with that for the auditory stimulation; the n…

Cited by 10 publications (5 citation statements)
References 63 publications
“…Weikum et al. (2007) reported that 4- to 6-month-old infants can discriminate between languages (French versus English) using visual cues only (i.e., silent videos showing articulations). Electrophysiological work shows that visual cues not only enhance auditory cues (e.g., Crosse et al., 2015; see Peelle & Sommers, 2015 for a review) but actually precede them (Strelnikov et al., 2015). In line with this reasoning, Venezia et al. (2016) propose that “exposure to visual speech during acquisition of speech production establishes the neural circuitry linking visually-perceived gestures to the speech motor system.” If visual information is used to facilitate speech acquisition, aberrant function of VN could interfere with this process.…”
Section: Discussion
confidence: 99%
“…Indeed, the auditory cortex is modulated by dynamic and congruent visual stimuli (Zvyagintsev et al., 2009) and vice versa (Wolf et al., 2014). During face-to-face communication, recruitment of the auditory cortex is increased not only by language sounds but also cross-modally by visual input such as lip movements and facial expressions (Hertrich et al., 2007; Okada et al., 2013; Strelnikov et al., 2015). In the case of co-speech gestures, Hubbard and colleagues (Hubbard et al., 2009) showed that bilateral non-primary auditory cortex exhibited greater activity when speech was accompanied by beat gestures than when speech was presented alone.…”
Section: Discussion
confidence: 99%
“…The Enhancement model is more biologically plausible: the auditory part does not use predictions from the visual part but instead learns from the spiking patterns. This represents multisensory learning, which helps propagate spikes from the visual group to the auditory group [67].…”
Section: The Enhancement Model
confidence: 99%