2022
DOI: 10.1016/j.neuroimage.2022.119217
Seeing a talking face matters: The relationship between cortical tracking of continuous auditory‐visual speech and gaze behaviour in infants, children and adults

Cited by 19 publications (28 citation statements) · References 91 publications
“…For example, neural tracking of auditory‐only nursery rhymes in 10‐ and 14‐month‐olds predicts vocabulary size at 24 months (Menn et al., 2022). For speech stimuli, neural tracking is greater in both infants and adults when audiovisual information from the speaker's face is available (Tan et al., 2022). Neural tracking is also greater in sung than spoken sentences in adults listening to audio only stimuli (der Nederlanden et al., 2020; der Nederlanden et al., 2022).…”
Section: Discussion
confidence: 99%
“…Five‐month‐old infants segmented words from the continuous speech of a target speaker in a multi‐talker environment when they could see the target speaker's articulatory movements, but not when they were presented with a still picture of the speaker (Hollich et al., 2005). Another study found that 5‐month‐old infants’ and adults’ cortical tracking of speech was enhanced when speech was audio‐visual compared to audio‐only, whereas 4‐year‐old children did not show this audio‐visual speech benefit (Tan et al., 2022). Furthermore, studies suggested a developmental shift in infants’ looking behaviour at a talking face, with infants gradually shifting their attention to the speaker's mouth rather than to the eyes between 4 and 8 months, and returning to attending preferentially to the speaker's eyes by 12 months (Lewkowicz & Hansen‐Tift, 2012; Tenenbaum et al., 2015).…”
Section: Discussion
confidence: 99%
“…For the ERP analyses, we excluded trials in which infants attended to the screen for less than 25% of the duration of the familiarisation phase, similar to previous eye‐tracking and EEG studies that suggested similar exclusion criteria (e.g. 15% in LoBue et al., 2017; 20% in Taylor & Herbert, 2013; also see Tan et al., 2022). This means that if the infant attended to the screen for less than 25% of the video duration time in the familiarisation phase, the subsequent ERP test trials that were part of the same block were discarded.…”
Section: Methods
confidence: 99%
“…Although evidence of the neural tracking mechanisms in typically developing children is limited, more recent studies have been able to show successful neural tracking of continuous natural speech in both infants (< 1 year) (Kalashnikova et al, 2018; Tan et al, 2022; Attaheri et al, 2022) and young children between 4 and 9 years old (Vander Ghinst et al, 2019; Ríos-López et al, 2020; Tan et al, 2022). For example, Ríos-López et al (2020) showed that speech-brain coupling already occurs at 4 years of age in the delta-band frequency range, but not at theta frequencies.…”
Section: Introduction
confidence: 99%