2022
DOI: 10.1093/cercor/bhab518

Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age

Abstract: The integration of visual and auditory cues is crucial for successful processing of speech, especially under adverse conditions. Recent reports have shown that when participants watch muted videos of speakers, the phonological information about the acoustic speech envelope, which is associated with but independent from the speakers’ lip movements, is tracked by the visual cortex. However, the speech signal also carries richer acoustic details, for example, about the fundamental frequency and the resonant frequ…
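For readers unfamiliar with the two feature classes the abstract contrasts, the sketch below illustrates, purely as an assumption-laden example and not the paper's actual pipeline, how a broadband amplitude envelope and coarse formant (resonant-frequency) trajectories can be derived from a speech recording in Python. The file name, sampling rate, LPC order, and cut-off frequencies are placeholder choices.

# Illustrative sketch (not the paper's pipeline): extracting the broadband
# amplitude envelope and rough formant (resonant-frequency) trajectories
# from a speech recording. File name and parameters are placeholders.
import numpy as np
import librosa
from scipy.signal import hilbert, butter, filtfilt

y, sr = librosa.load("speech.wav", sr=16000, mono=True)

# 1) Amplitude envelope: magnitude of the analytic signal, low-passed
#    to the slow (< 10 Hz) modulations that cortical tracking studies use.
env = np.abs(hilbert(y))
b, a = butter(4, 10 / (sr / 2), btype="low")
env_slow = filtfilt(b, a, env)

# 2) Formant estimates per 25 ms frame via LPC root finding: roots of the
#    LPC polynomial close to the unit circle approximate the vocal-tract
#    resonances (F1, F2, ...).
frame_len, hop = int(0.025 * sr), int(0.010 * sr)
formants = []
for start in range(0, len(y) - frame_len, hop):
    frame = y[start:start + frame_len] * np.hamming(frame_len)
    lpc = librosa.lpc(frame, order=10)
    roots = [r for r in np.roots(lpc) if np.imag(r) >= 0]
    freqs = sorted(np.angle(roots) * sr / (2 * np.pi))
    formants.append([f for f in freqs if f > 90][:3])  # keep F1-F3 candidates

The envelope captures the slow syllabic energy fluctuations that most tracking studies use, while the per-frame LPC roots approximate the formants that the title's "formant modulations" refer to.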

Cited by 9 publications (12 citation statements)
References 68 publications
“…Previous work has shown that lip movements activate a network of temporal, parietal and frontal regions (Bourguignon et al, 2020; Calvert et al, 1997; Capek et al, 2008; O’Sullivan et al, 2017; Ozker et al, 2018; Paulesu et al, 2003; Pekkola et al, 2005) and that both occipital and motor regions can align their activity to the dynamics of lip movements (Park et al, 2018, 2016). The present data substantiate this, but also show that the representation of the physically visible lip trajectory along visual pathways is accompanied by the representation of spectral acoustic features, a type of selectivity not directly revealed previously (Suess et al, 2022). Spectral features are vital for a variety of listening tasks (Albouy et al, 2020; Bröhl and Kayser, 2020; Ding and Simon, 2013; Tivadar et al, 2020, 2018), and oro-facial movements provide concise information about the spectral domain.…”
Section: Discussion (supporting)
confidence: 90%
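As an illustration of the generic "tracking"/alignment measure that statements like the one above refer to, the following is a minimal, hedged sketch of magnitude-squared coherence between a lip-aperture time series and a single neural channel. The signals here are synthetic placeholders, and published analyses typically operate on source-localised MEG/EEG with permutation statistics rather than this toy example.

# Minimal sketch of a "tracking" measure: magnitude-squared coherence between
# a lip-aperture time series and one neural channel. Signals are synthetic
# placeholders; real studies use source-level MEG/EEG and permutation stats.
import numpy as np
from scipy.signal import coherence

fs = 100.0                      # placeholder sampling rate after downsampling (Hz)
t = np.arange(0, 300, 1 / fs)   # 5 minutes of data

rng = np.random.default_rng(0)
lip_aperture = np.sin(2 * np.pi * 4 * t) + 0.5 * rng.standard_normal(t.size)
neural = 0.3 * np.sin(2 * np.pi * 4 * t + 0.8) + rng.standard_normal(t.size)

# Welch-based coherence; 2 s segments give ~0.5 Hz frequency resolution.
f, cxy = coherence(lip_aperture, neural, fs=fs, nperseg=int(2 * fs))

# Average coherence in the 1-8 Hz band where syllabic/lip dynamics live.
band = (f >= 1) & (f <= 8)
print(f"Mean 1-8 Hz lip-brain coherence: {cxy[band].mean():.3f}")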
“…Visual speech contains temporal information that can be predictive of subsequent acoustic signals and allows mapping visual cues onto phonological representations (Campbell, 2008; Lazard and Giraud, 2017). Importantly, the visual cortex tracks dynamic lip signals (Park et al, 2016) and, as suggested recently, may also restore the unheard acoustic envelope of visually presented speech (Hauswald et al, 2018; Suess et al, 2022). Importantly, the evidence that visual speech induces information about the unheard speech acoustics along both the auditory and visual pathways may not be mutually exclusive, as both may contribute to a supramodal frame of reference for speech (Arnal et al, 2009; Rauschecker, 2012).…”
Section: Introduction (mentioning)
confidence: 85%
“…One view is that the visual system directly contributes to establishing speech representations (Bernstein et al, 2011; O’Sullivan et al, 2017; Ozker et al, 2018), as oro-facial movements provide temporal information that can be predictive of concurrent acoustic signals and allow mapping visual cues onto phonological representations (Campbell, 2008; Lazard and Giraud, 2017). The visual cortex tracks dynamic lip signals (Park et al, 2016) and, as suggested recently, may also directly “restore” the acoustic envelope of the visually presented speech (Hauswald et al, 2018; Suess et al, 2022). Another view is that visual speech is mainly represented in regions of the auditory pathways, possibly exploiting speech-specific processes of this system.…”
Section: Introduction (mentioning)
confidence: 91%
“…A recent study using electrocorticography similarly demonstrated that medial occipital cortex exhibits reliable auditory envelope tracking in the absence of visual speech (Micheli et al, 2020). Other studies have suggested that visual cortex represents unheard auditory speech during silent lipreading by tracking its amplitude envelope (Hauswald et al, 2018) and higher-level linguistic feature representations (Nidiffer et al, 2021; Suess et al, 2022). Correspondingly, we also found evidence of visual cortex tracking the unheard auditory speech envelope in silent lipreading (Fig.…”
Section: Discussion (mentioning)
confidence: 99%
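The statement above describes visual cortex tracking the unheard auditory speech envelope; a common complementary analysis in this literature is linear stimulus reconstruction (a backward model). The sketch below shows the general idea with ridge regression on time-lagged channels; the data, lag window, and regularisation strength are placeholder assumptions, not the parameters of the studies quoted here.

# Generic stimulus-reconstruction sketch: ridge regression from time-lagged
# neural channels back onto the speech envelope, scored with Pearson r.
# Shapes, lags, and the regularisation strength are illustrative choices.
import numpy as np
from sklearn.linear_model import Ridge

def lagged_design(neural, max_lag):
    """Stack copies of the (time x channels) neural data at lags 0..max_lag."""
    n_times, n_ch = neural.shape
    X = np.zeros((n_times, n_ch * (max_lag + 1)))
    for lag in range(max_lag + 1):
        X[lag:, lag * n_ch:(lag + 1) * n_ch] = neural[:n_times - lag]
    return X

fs = 100                                        # placeholder sampling rate (Hz)
rng = np.random.default_rng(1)
envelope = rng.standard_normal(6000)            # 60 s placeholder envelope
neural = rng.standard_normal((6000, 64))        # 64-channel placeholder data

X = lagged_design(neural, max_lag=int(0.25 * fs))   # 0-250 ms lags
half = len(envelope) // 2

model = Ridge(alpha=1e3).fit(X[:half], envelope[:half])
pred = model.predict(X[half:])
r = np.corrcoef(pred, envelope[half:])[0, 1]
print(f"Reconstruction accuracy (Pearson r): {r:.3f}")

With the random placeholder data the reconstruction accuracy hovers around zero; with real recordings the held-out correlation is the usual measure of how strongly a region tracks the envelope.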