2022
DOI: 10.1101/2022.06.19.496753
Preprint

The domain-separation low-dimensional language network dynamics in the resting-state support the flexible functional segregation and integration during language and speech processing

Abstract: Modern linguistic theories and network science propose that language and speech processing is organized into hierarchical, segregated large-scale subnetworks, with a core dorsal (phonological) stream and ventral (semantic) stream. The two streams are asymmetrically recruited in receptive and expressive language and speech tasks, showing flexible functional segregation and integration. We hypothesized that the functional segregation of the two streams is supported by the underlying network segregati…

Cited by 3 publications (7 citation statements); references 125 publications.
“…The third CAP mainly included the seed points and the language network (LAN) (Figures 3 and 4). LAN predominantly consists of the left superior temporal gyri, superior temporal gyri, inferior frontal gyri, and ventral precentral gyri, as documented by various studies (Lipkin et al, 2022; Malik-Moraleda et al, 2022; Yuan, Xie, & Wang et al, 2023c). A high spatial similarity was observed between the time-recurring states across the 10 story scans (Supplementary Figure 4).…”
Section: Seed-based Co-activation Pattern Analysis (CAP) Results (supporting)
confidence: 67%
“…Secondly, an exponentially weighted moving average (EWMA) window is applied to the standardized residuals to compute a non-normalized version of the time-varying correlation matrix between Regions of Interest (ROIs). Mathematical expressions of the GARCH (1,1) model, DCC model, EWMA, and estimations of the model parameters can be found in (Lindquist et al, 2014) and related literature (Yuan, Xie, & Gong et al, 2023;Yuan, Xie, & Wang et al, 2023b).…”
Section: Dynamic Conditional Correlation (DCC) (mentioning)
confidence: 99%
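The quoted passage summarizes the two-stage DCC pipeline: univariate GARCH(1,1) fits standardize each ROI's residuals, and an EWMA of their outer products gives a non-normalized time-varying correlation matrix that is then rescaled to unit diagonal. Below is a minimal Python sketch of that idea, assuming a T x N array `data` of ROI time series, the `arch` package for the GARCH(1,1) fits, and a fixed smoothing constant `lam` standing in for the DCC parameters that Lindquist et al. (2014) estimate by quasi-maximum likelihood; it is an illustrative sketch, not the authors' implementation.

```python
import numpy as np
from arch import arch_model

def dcc_ewma_correlations(data, lam=0.94):
    """Return a T x N x N stack of time-varying ROI correlation matrices."""
    T, N = data.shape
    # Step 1: fit a univariate GARCH(1,1) per ROI and standardize the residuals
    # by the fitted conditional volatility.
    std_resid = np.empty((T, N))
    for j in range(N):
        fit = arch_model(data[:, j], mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")
        std_resid[:, j] = np.asarray(fit.resid) / np.asarray(fit.conditional_volatility)
    # Step 2: EWMA of outer products of the standardized residuals
    # (the "non-normalized" time-varying correlation mentioned above).
    Q = np.cov(std_resid, rowvar=False)      # initialize at the unconditional estimate
    R = np.empty((T, N, N))
    for t in range(T):
        e = std_resid[t][:, None]
        Q = (1.0 - lam) * (e @ e.T) + lam * Q
        # Step 3: rescale so the diagonal is exactly 1 (a proper correlation matrix).
        d = 1.0 / np.sqrt(np.diag(Q))
        R[t] = Q * np.outer(d, d)
    return R
```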
“…At the listener’s side, the clusters covered an extensive range of brain regions, including a wide range of the prefrontal cortex (CH2, CH5-16) and the classical language regions. Based on the anatomical localization and functional organization within the language network (35, 36, 39, 40), these language regions were divided into ventral language regions, including right STG, MTG (CH29/31/35) and bilateral inferior parietal lobe (IPL, CH23/24/26/33/34/36), and dorsal language regions, including bilateral IFG (CH17/27) and bilateral pre- and post-central gyrus (pre-/postCG, CH18/20/22/28/30/31). The channel and localization of these channel combinations are shown in Figures 3A and S1A.…”
Section: Results (mentioning)
confidence: 99%
“…Network reorganization or dynamic functional connectivity (dFC) can be estimated frame by frame or within a short window (~20-30 time points). Numerous dFC methods have been proposed, such as sliding-window functional connectivity with L1-regularization (SWFC) (Allen et al, 2014; Shakil et al, 2016), dynamic conditional correlation (DCC) (Choe et al, 2017; Lindquist et al, 2014; Yuan et al, 2023a; Yuan et al, 2023b), Multiplication of Temporal Derivatives (MTD) (Shine et al, 2015), Flexible Least Squares (FLS) (Kalaba and Tesfatsion, 1989; Liao et al, 2014), the general linear Kalman filter (Kang et al, 2011; Milde et al, 2010), Hidden Markov models (HMM) (Chen et al, 2022; Eavani et al, 2013; Vidaurre et al, 2017), and Hidden semi-Markov models (HSMM) (Shappell et al, 2019). Although the accuracy, reliability, and potential caveats of these methods have been demonstrated, a comprehensive comparison of their efficacy is lacking.…”
Section: Introduction (mentioning)
confidence: 99%
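To make two of the listed estimators concrete, here is a brief sketch of sliding-window correlation and the Multiplication of Temporal Derivatives, assuming `ts` is a T x N ROI time-series array; the window lengths are hypothetical choices, and this is an illustration rather than any of the cited implementations.

```python
import numpy as np

def sliding_window_fc(ts, win=30):
    """Sliding-window Pearson correlation: one N x N matrix per window position."""
    T, N = ts.shape
    return np.stack([np.corrcoef(ts[t:t + win], rowvar=False)
                     for t in range(T - win + 1)])

def mtd(ts, win=7):
    """Multiplication of Temporal Derivatives (Shine et al., 2015): products of
    rescaled first temporal derivatives, smoothed with a simple moving average."""
    dt = np.diff(ts.astype(float), axis=0)        # first derivatives, (T-1) x N
    dt /= dt.std(axis=0, ddof=1)                  # rescale each ROI's derivative by its SD
    raw = dt[:, :, None] * dt[:, None, :]         # pairwise coupling at each time point
    kernel = np.ones(win) / win
    # average the coupling series over a short window for every ROI pair
    return np.apply_along_axis(lambda v: np.convolve(v, kernel, mode="valid"), 0, raw)
```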