2021
DOI: 10.1101/2021.01.23.427890
Preprint

Envelope reconstruction of speech and music highlights unique tracking of speech at low frequencies

Abstract: The human brain tracks amplitude fluctuations of both speech and music, which reflects acoustic processing in addition to the processing of higher-order features and one’s cognitive state. Comparing neural tracking of speech and music envelopes can elucidate stimulus-general mechanisms, but direct comparisons are confounded by differences in their envelope spectra. Here, we use a novel method of frequency-constrained reconstruction of stimulus envelopes using EEG recorded during passive listening. We expected …
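The abstract describes reconstructing stimulus envelopes from EEG with a backward (decoding) model. The paper's actual frequency-constrained procedure is not reproduced here; the sketch below is only a minimal, generic lagged ridge-regression decoder in the mTRF tradition, with all data, names, and parameters invented for illustration:

```python
import numpy as np

def lagged(X, n_lags):
    """Time-lagged design matrix from multichannel EEG.
    X: (n_samples, n_channels) -> (n_samples, n_channels * n_lags)."""
    n, c = X.shape
    D = np.zeros((n, c * n_lags))
    for k in range(n_lags):
        D[k:, k * c:(k + 1) * c] = X[:n - k]
    return D

def fit_decoder(eeg, envelope, n_lags=8, alpha=1.0):
    """Backward model: ridge regression mapping lagged EEG to the envelope."""
    D = lagged(eeg, n_lags)
    # Closed-form ridge solution: w = (D'D + alpha*I)^-1 D'y
    return np.linalg.solve(D.T @ D + alpha * np.eye(D.shape[1]), D.T @ envelope)

def reconstruct(eeg, w, n_lags=8):
    return lagged(eeg, n_lags) @ w

# Toy demonstration (synthetic data, not real EEG): an "envelope" that is a
# fixed linear mix of channels should be recovered almost perfectly.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((2000, 4))
env = eeg @ np.array([0.5, -0.2, 0.1, 0.3])
w = fit_decoder(eeg, env)
r = np.corrcoef(reconstruct(eeg, w), env)[0, 1]
```

In practice, reconstruction accuracy is the correlation between the reconstructed and actual envelope on held-out data; the frequency constraint the paper names would additionally restrict this comparison to specific envelope-spectrum bands.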

Cited by 4 publications (3 citation statements)
References 84 publications (119 reference statements)
“…Neural tracking is an essential mechanism subserving an efficient interaction with the external world. It was reliably measured in the visual (Bourguignon et al, 2020; Hauswald et al, 2018; King et al, 2016; Spaak et al, 2014) and auditory (Di Liberto et al, 2015; Keitel et al, 2017; Kösem et al, 2018; Zoefel & VanRullen, 2016; Zuk et al, 2021) systems following continuous stimulation. Here, we extended the measurement of neural tracking to the somatosensory domain, employing a continuous and naturalistic tactile stimulation of the hands.…”
Section: Discussion
confidence: 99%
“…Neuronal populations can synchronize their activity (through alignment of the phase) to temporal profiles of a continuous input (Lakatos et al, 2019; Obleser & Kayser, 2019). This neural tracking has been measured with a variety of stimuli and tasks, including visual stimulation (Bourguignon et al, 2020; Hauswald et al, 2018; King et al, 2016; Spaak et al, 2014), speech processing (Di Liberto et al, 2015; Keitel et al, 2017; Kösem et al, 2018; Zoefel & VanRullen, 2016), music (Zuk et al, 2021) and multisensory inputs (Jessen et al, 2019; Thézé et al, 2020). In the case of the forward or encoding models, the mathematical framework underpinning the neural tracking aims to predict ongoing brain activity from continuous stimulus features (e.g., Crosse et al, 2016; Lakatos et al, 2019).…”
Section: Introduction
confidence: 99%
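The forward (encoding) framework described in the statement above predicts ongoing brain activity from time-lagged stimulus features, with the fitted weights forming a temporal response function (TRF). A minimal closed-form ridge sketch, assuming a single 1-D envelope feature and synthetic data (not the cited papers' actual pipelines):

```python
import numpy as np

def lagged_stim(s, n_lags):
    """Time-lagged design matrix from a 1-D stimulus envelope."""
    n = s.shape[0]
    D = np.zeros((n, n_lags))
    for k in range(n_lags):
        D[k:, k] = s[:n - k]
    return D

def fit_trf(envelope, eeg, n_lags, alpha=0.1):
    """Forward (encoding) model: one temporal response function per EEG
    channel, estimated with closed-form ridge regression."""
    D = lagged_stim(envelope, n_lags)
    return np.linalg.solve(D.T @ D + alpha * np.eye(n_lags), D.T @ eeg)

# Synthetic check: "EEG" generated by convolving the envelope with a known
# 3-lag kernel (plus noise) should yield a TRF estimate near that kernel.
rng = np.random.default_rng(1)
env = rng.standard_normal(3000)
true_trf = np.array([[0.8], [0.3], [-0.5]])          # one channel, 3 lags
eeg = lagged_stim(env, 3) @ true_trf + 0.1 * rng.standard_normal((3000, 1))
W = fit_trf(env, eeg, n_lags=3)
```

The backward (decoding) model is the mirror image of this: it regresses the stimulus on lagged neural data instead, which is why the two directions are usually distinguished in this literature.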
“…Nevertheless, a somewhat agnostic view on such underlying neural mechanisms would not prevent us from making valuable theoretical and practical use of such measurements. Work using such measures has already contributed to our understanding of speech (Mesgarani et al, 2014; Di Liberto et al, 2015, 2021a; Ding et al, 2015; Brodbeck et al, 2018; Broderick et al, 2018) and music perception (Tal et al, 2017; Di Liberto et al, 2020, 2021b; Marion et al, 2021; Zuk et al, 2021), selective attention (O'Sullivan et al, 2014; Decruy et al, 2020; Fuglsang et al, 2020), multisensory integration (Crosse et al, 2016; Sullivan et al, 2021), and even abstract cognitive processes such as arithmetic (Kulasingham et al, 2021). The work in this Research Topic attempts to portray a wide set of findings while using consistent terminology.…”
Section: Editorial On the Research Topic Neural Tracking: Closing The...
confidence: 99%