2018
DOI: 10.1371/journal.pbio.2006558

Representational interactions during audiovisual speech entrainment: Redundancy in left posterior superior temporal gyrus and synergy in left motor cortex

Abstract: Integration of multimodal sensory information is fundamental to many aspects of human behavior, but the neural mechanisms underlying these processes remain mysterious. For example, during face-to-face communication, we know that the brain integrates dynamic auditory and visual inputs, but we do not yet understand where and how such integration mechanisms support speech comprehension. Here, we quantify representational interactions between dynamic audio and visual speech signals and show that different brain re…

Cited by 68 publications (97 citation statements)
References 57 publications

“…This phase alignment in turn determines systematic, stimulus-locked variations in neuronal activity, as indexed by fluctuations in broadband high-frequency activity. It was further shown that visual speech gestures enhance intelligibility by facilitating auditory cortical entrainment to the speech stream (Crosse, Butler and Lalor, 2015; Perrodin et al., 2015; Park et al., 2016, 2018; Di Liberto et al., 2018; Micheli et al., 2018). Here, we used iEEG recordings for a more direct examination of the neurophysiological mechanisms underlying visual enhancement of auditory cortical speech processing.…”
Section: Discussion (mentioning)
confidence: 99%
“…There is strong support for the phase-reset hypothesis in non-human primates (Perrodin et al., 2015). In humans, noninvasive neurophysiology has brought solid evidence that visual speech entrains oscillatory activity in widespread regions of the cerebral cortex, including areas involved in speech perception and production (Crosse, Butler and Lalor, 2015; Park et al., 2016, 2018). However, limitations inherent to noninvasive methods leave two crucial sets of questions unanswered.…”
Section: Introduction (mentioning)
confidence: 99%
“…PID was previously used to decompose the information brought by acoustic and visual speech signals about brain oscillatory activity [73], and to compare auditory encoding models of MEG during speech processing [62]. As in these references, we measure redundancy with pointwise common change in surprisal for Gaussian variables.…”
Section: Partial Information Decomposition (mentioning)
confidence: 99%
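
For orientation, the sketch below shows how the mutual-information terms that feed such a decomposition can be computed for jointly Gaussian variables, and why a dedicated redundancy measure is needed: the co-information I(A;B) + I(V;B) − I(A,V;B) captures only redundancy minus synergy. The variable names, correlation strengths, and helper functions are illustrative assumptions, not the pipeline of the cited studies, and co-information is used here as a simple average proxy rather than the pointwise common-change-in-surprisal measure itself.

import numpy as np

def gaussian_entropy(cov):
    # Differential entropy (bits) of a Gaussian with covariance matrix `cov`.
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    return 0.5 * np.log2(np.linalg.det(cov) * (2 * np.pi * np.e) ** d)

def gaussian_mi(cov, idx_x, idx_y):
    # Mutual information I(X;Y) in bits for jointly Gaussian variables,
    # where idx_x and idx_y index rows/columns of the full covariance `cov`.
    cx = cov[np.ix_(idx_x, idx_x)]
    cy = cov[np.ix_(idx_y, idx_y)]
    cxy = cov[np.ix_(idx_x + idx_y, idx_x + idx_y)]
    return gaussian_entropy(cx) + gaussian_entropy(cy) - gaussian_entropy(cxy)

# Toy data: an "auditory" signal A, a correlated "visual" signal V, and a
# "brain" signal B driven by both. All correlations are made up for illustration.
rng = np.random.default_rng(0)
n = 20000
A = rng.standard_normal(n)
V = 0.6 * A + 0.8 * rng.standard_normal(n)
B = 0.5 * A + 0.3 * V + 0.7 * rng.standard_normal(n)

cov = np.cov(np.vstack([A, V, B]))    # variables ordered as A=0, V=1, B=2
i_a  = gaussian_mi(cov, [0], [2])     # I(A; B)
i_v  = gaussian_mi(cov, [1], [2])     # I(V; B)
i_av = gaussian_mi(cov, [0, 1], [2])  # I(A,V; B)

# Co-information equals redundancy minus synergy under any PID; it cannot
# separate the two, which is why a dedicated redundancy measure (e.g., Iccs)
# is required for a full decomposition.
co_info = i_a + i_v - i_av
print(f"I(A;B)={i_a:.3f}  I(V;B)={i_v:.3f}  I(A,V;B)={i_av:.3f}  co-info={co_info:.3f}")
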
“…The partial information decomposition has been applied to data in neuroscience; see, e.g., [46, 47, 48, 49]. For a recent overview, see [50].…”
Section: Application of Partial Information Decomposition (mentioning)
confidence: 99%
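
To make the redundancy/synergy distinction concrete, here is a minimal discrete example using the I_min redundancy measure of Williams and Beer (2010), the original PID proposal: an XOR target carries 1 bit jointly but nothing redundantly (pure synergy), whereas a copied source is fully redundant. The toy distributions and function names are illustrative choices, not drawn from the cited studies.

from collections import defaultdict
from math import log2

def imin_redundancy(pxxy):
    # Williams & Beer (2010) I_min redundancy of sources X1, X2 about target Y.
    # `pxxy` maps (x1, x2, y) -> probability of that joint outcome.
    py = defaultdict(float)
    source_joints = [defaultdict(float), defaultdict(float)]  # p(xi, y) for i = 1, 2
    for (x1, x2, y), p in pxxy.items():
        py[y] += p
        source_joints[0][(x1, y)] += p
        source_joints[1][(x2, y)] += p

    red = 0.0
    for y, p_y in py.items():
        specific = []
        for pj in source_joints:
            # Specific information I(Y=y; Xi) = sum_xi p(xi|y) * log2( p(y|xi) / p(y) )
            s = 0.0
            for (xi, yy), p_xy in pj.items():
                if yy != y:
                    continue
                p_xi = sum(q for (xj, _), q in pj.items() if xj == xi)
                s += (p_xy / p_y) * log2((p_xy / p_xi) / p_y)
            specific.append(s)
        red += p_y * min(specific)  # redundancy = expected minimum specific information
    return red

def joint_mi(pxxy):
    # I(X1,X2; Y) in bits, computed directly from the joint distribution.
    px = defaultdict(float)
    py = defaultdict(float)
    for (x1, x2, y), p in pxxy.items():
        px[(x1, x2)] += p
        py[y] += p
    return sum(p * log2(p / (px[(x1, x2)] * py[y])) for (x1, x2, y), p in pxxy.items())

# XOR: Y = X1 xor X2 -> joint MI is 1 bit, redundancy ~0, so the bit is synergistic.
xor = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}
# COPY: X1 = X2 = Y -> joint MI is 1 bit and fully redundant.
copy = {(x, x, x): 0.5 for x in (0, 1)}

for name, dist in [("XOR", xor), ("COPY", copy)]:
    print(name, "joint MI:", round(joint_mi(dist), 3), "redundancy:", round(imin_redundancy(dist), 3))
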