2018
DOI: 10.1152/jn.00869.2017

Detecting multivariate cross-correlation between brain regions

Abstract: The problem of identifying functional connectivity from multiple time series data recorded in each of two or more brain areas arises in many neuroscientific investigations. For a single stationary time series in each of two brain areas, statistical tools such as cross-correlation and Granger causality may be applied. On the other hand, to examine multivariate interactions at a single time point, canonical correlation, which finds the linear combinations of signals that maximize the correlation, may be used. We …
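The abstract's description of canonical correlation (finding the linear combinations of two sets of signals that maximize their correlation) can be made concrete in a few lines of code. The sketch below is a minimal, hypothetical illustration using scikit-learn's CCA on simulated two-area recordings; it is not the paper's proposed method, which extends this idea to multivariate time series.

```python
# Minimal canonical correlation analysis (CCA) sketch between two simulated
# "brain areas". The data, dimensions, and shared latent signal are all
# hypothetical choices for illustration.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_samples = 500

# One shared latent signal drives both areas; everything else is noise.
latent = rng.standard_normal(n_samples)
X = np.outer(latent, rng.standard_normal(10)) + rng.standard_normal((n_samples, 10))
Y = np.outer(latent, rng.standard_normal(8)) + rng.standard_normal((n_samples, 8))

# CCA finds weight vectors a, b that maximize corr(X @ a, Y @ b).
cca = CCA(n_components=2).fit(X, Y)
Xc, Yc = cca.transform(X, Y)

# Correlation of the first pair of canonical variates.
rho = np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]
print(f"first canonical correlation: {rho:.3f}")
```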

Cited by 8 publications (7 citation statements) · References 14 publications (16 reference statements)
“…In summary, both dPCA and kdPCA recovered similar independent stimulus and time dimensions when the simulated activity depended on linearly separable temporal and stimulus components. As expected, kdPCA with a linear kernel produced identical results to dPCA, demonstrating that these two formulations are equivalent (Rodu et al., 2018). With a Gaussian kernel, kdPCA achieved close to the same performance as dPCA (the optimal method for this example).…”
Section: Example 1: Low-Dimensional Summation of Components (supporting)
confidence: 75%
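The linear-kernel equivalence this statement describes (a kernel method with a linear kernel reduces to its linear counterpart) is easy to verify numerically in the simpler PCA setting. The sketch below is an illustrative check with scikit-learn on synthetic data, not the kdPCA or dPCA code from the cited papers: kernel PCA with a linear kernel recovers ordinary PCA scores up to a sign flip per component.

```python
# Check that kernel PCA with a linear kernel reproduces ordinary PCA
# (up to the sign of each component) -- the same kind of equivalence the
# statement describes for kdPCA vs. dPCA. Synthetic data; illustrative only.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 6)) @ rng.standard_normal((6, 6))

scores_pca = PCA(n_components=3).fit_transform(X)
scores_kpca = KernelPCA(n_components=3, kernel="linear").fit_transform(X)

# Each component should match up to an arbitrary sign flip.
for k in range(3):
    r = np.corrcoef(scores_pca[:, k], scores_kpca[:, k])[0, 1]
    print(f"component {k}: |corr| = {abs(r):.6f}")  # ~1.000000
```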
“…I present an extension to dPCA to find nonlinear task-related components. This method is related to kernel-based extensions of standard principal component analysis (PCA) (Schölkopf et al., 1998), canonical correlation analysis (CCA) (Lai & Fyfe, 2000; Hardoon et al., 2004; Rodu et al., 2018), and kernel regularized least-squares regression (Hainmueller & Hazlett, 2014). In this method, the data points are projected from neural activity space (ℝ^N) into a potentially higher-dimensional space (A).…”
Section: Introduction (mentioning)
confidence: 99%
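To make the projection idea concrete, the sketch below implements a simplified regularized kernel CCA in the spirit of the cited kernel extensions (Lai & Fyfe, 2000; Hardoon et al., 2004). The Gaussian kernel, the regularizer kappa, and the toy nonlinear coupling are all illustrative assumptions; the eigenproblem follows one common simplification of the kernel CCA stationarity conditions, not the cited authors' exact formulations.

```python
# A minimal regularized kernel CCA sketch. Gaussian kernels map the data
# implicitly into a higher-dimensional space; the regularizer kappa keeps the
# kernel inverses well conditioned. Simplified and illustrative only.
import numpy as np

def gaussian_kernel(X, gamma=0.5):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    return np.exp(-gamma * d2)

def center_kernel(K):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return H @ K @ H

def kernel_cca_rho(X, Y, gamma=0.5, kappa=0.1):
    """Leading canonical correlation between the kernelized views X and Y."""
    n = X.shape[0]
    Kx = center_kernel(gaussian_kernel(X, gamma))
    Ky = center_kernel(gaussian_kernel(Y, gamma))
    I = np.eye(n)
    # The stationarity conditions reduce (after simplification) to the
    # eigenproblem  (Ky + kI)^-1 Kx (Kx + kI)^-1 Ky  b = rho^2 b.
    M = np.linalg.solve(Ky + kappa * I, Kx) @ np.linalg.solve(Kx + kappa * I, Ky)
    eigvals = np.linalg.eigvals(M)
    return np.sqrt(np.clip(eigvals.real.max(), 0.0, 1.0))

rng = np.random.default_rng(2)
t = rng.uniform(-2, 2, size=150)
# Two views related only through nonlinear functions of a shared variable t.
X = np.column_stack([t, t ** 2]) + 0.1 * rng.standard_normal((150, 2))
Y = np.column_stack([np.sin(t), t ** 3]) + 0.1 * rng.standard_normal((150, 2))
print(f"kernel CCA rho ~= {kernel_cca_rho(X, Y):.3f}")
```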
“…5). This characterization would have been difficult with existing time series methods that incorporate dimensionality reduction but discard within-area activity as noise [54, 55]. DLAG also offers unique advantages when characterizing the temporal structure of activity within and across areas.…”
Section: Discussion (mentioning)
confidence: 99%
“…Li & Chen, 2006;Shmueli et al, 2007). In the past, much work has been done to improve CCA for time series data correlation, such as multilinear CCA (Lu, 2013), kernel CCA (Bießmann et al, 2010), and dynamic kernel CCA (Rodu et al, 2018). While useful for providing global measures for predefined time intervals, these techniques can be computationally expensive or difficult to use in evaluating acute stress response or temporal stress correlation changes.…”
Section: Introductionmentioning
confidence: 99%
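One lightweight alternative to the global methods this statement mentions is to re-fit an ordinary CCA inside a sliding window, which yields a coarse trace of how cross-correlation changes over time. The sketch below is a hypothetical illustration with scikit-learn; the window length, step size, and simulated coupling switch are arbitrary choices.

```python
# Sliding-window CCA sketch for tracking time-varying cross-correlation
# between two multivariate signals. All parameters are illustrative.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(3)
n, win, step = 2000, 200, 100

# Coupling between the two 5-channel signals switches on halfway through,
# so the windowed canonical correlation should rise in the second half.
latent = rng.standard_normal(n)
gain = np.where(np.arange(n) < n // 2, 0.0, 1.0)
X = np.outer(gain * latent, rng.standard_normal(5)) + rng.standard_normal((n, 5))
Y = np.outer(gain * latent, rng.standard_normal(5)) + rng.standard_normal((n, 5))

for start in range(0, n - win + 1, step):
    sl = slice(start, start + win)
    Xc, Yc = CCA(n_components=1).fit(X[sl], Y[sl]).transform(X[sl], Y[sl])
    rho = np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]
    print(f"t = {start:4d}-{start + win:4d}: rho = {rho:.2f}")
```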