2021
DOI: 10.1016/j.crneur.2021.100014

Auditory detection is modulated by theta phase of silent lip movements


Cited by 15 publications (17 citation statements)
References 68 publications
“…It has been proposed that visual cues reset auditory delta/theta dynamics to prepare them for upcoming acoustic information (Biau et al, 2021; Mégevand et al, 2020; Thorne & Debener, 2014). Processing of audio-visual speech information leads to shorter neural response latencies than those evoked by auditory-only speech (van Wassenhove et al, 2005), and the presentation of distinct acoustic and visual consonantal information can lead to the percept of a third consonant (McGurk & MacDonald, 1976).…”
Section: E Cross-modality and Sensory-motor Interactions
mentioning, confidence: 99%
“…For example, visual speech cues reset auditory dynamics in general, not only speech-specific ones (Biau et al, 2021). Some evidence for speech-specific effects has been reported for auditory-motor interactions.…”
Section: B4 Cross-modality and Sensory-motor Interactions
mentioning, confidence: 99%
“…Furthermore, other sources of interference with putative MMN measurements may emerge as unintended consequences of subject behaviour during passive auditory oddball experiments. For instance, two common approaches are to instruct the subject to watch a silent film or read a book, which present different sets of cues that could unintentionally modulate the auditory response in different respects (Biau et al, 2021; Rayner and Clifton, 2009; Zoefel, 2021). As such, demonstrating that MMN reflects a specific prediction-error signal is challenging, and the balance of evidence from the current study leans decidedly towards the hypothesis that intensity MMN reflects modulation of obligatory sensory components of the ERP.…”
Section: Results
mentioning, confidence: 82%
“…Importantly, a partial coherence between the left motor region oscillations and lip movement rate has also been identified that directly predicted the participants' performance on comprehension, suggesting that motor cortex could facilitate the integration of audiovisual speech through predictive coding and active sensing (Park et al, 2016, 2018). Several recent studies have proposed that visual cortex entrainment to rhythmic lip motion modulates the responses of auditory cortex via theta phase synchronization (Crosse et al, 2015; Zoefel, 2021; see Figure 3), including when only visual speech is presented (Bourguignon et al, 2020; Biau et al, 2021).…”
Section: Evolution and Development Of Multimodal Integration In The P...
mentioning, confidence: 99%