2008
DOI: 10.1103/PhysRevE.77.056215
Kernel-Granger causality and the analysis of dynamical networks

Abstract: We propose a method of analysis of dynamical networks based on a recent measure of Granger causality between time series, itself built on kernel methods. The generalization of kernel-Granger causality to the multivariate case, here presented, shares the following features with the bivariate measure: (i) the nonlinearity of the regression model can be controlled by choosing the kernel function, and (ii) the problem of false causalities, arising as the complexity of the model increases, is addressed by a selection stra…
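The abstract refers to Granger causality between time series, which the paper extends with kernels. As a minimal sketch of the standard linear baseline only (not the paper's kernel method), the following compares the residual variance of a restricted autoregressive model of y with that of a full model that also uses the past of x; all function and variable names here are illustrative:

```python
# Minimal sketch of bivariate linear Granger causality (the baseline the
# paper generalizes with kernels). x "Granger-causes" y when past values
# of x reduce the prediction error of y beyond what y's own past achieves.
import numpy as np

def granger_index(x, y, m=2):
    """Linear Granger causality index of x -> y with model order m (illustrative)."""
    n = len(y)
    Y = y[m:]
    # Lagged design matrices: restricted (past of y only) and full (past of y and x).
    past_y = np.column_stack([y[m - k - 1:n - k - 1] for k in range(m)])
    past_x = np.column_stack([x[m - k - 1:n - k - 1] for k in range(m)])
    full = np.hstack([past_y, past_x])

    def resid_var(X):
        # Least-squares fit, then variance of the prediction residuals.
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.var(Y - X @ beta)

    # Log-ratio index: larger than 0 suggests x helps predict y.
    return np.log(resid_var(past_y) / resid_var(full))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()  # y is driven by the past of x

print(granger_index(x, y) > granger_index(y, x))  # driving direction dominates
```

Replacing the linear regressions above with regressions in a kernel-induced feature space is, schematically, what the kernel-Granger approach does.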

Cited by 153 publications (154 citation statements)
References 43 publications
“…A second way to calculate these linear indices can be found in [8], where GC was equivalently reformulated and supplemented with a statistical testing procedure (SP), detailed in [8], to decide whether the GCI equals zero and to handle overfitting, i.e. to eliminate possible parasitic redundancies in the set of predictive variables.…”
Section: B. Granger Causality
confidence: 99%
“…(2) respectively, aside from the redundancy-correction effect. The method was generalized to the nonlinear case using kernel methods [8]. It uses the inhomogeneous polynomial (IP) kernel of order p and the Gaussian kernel of width σ.…”
Section: B. Granger Causality
confidence: 99%
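The excerpt above names two kernels. Written out as plain functions they look as follows; this is only an illustration of the standard kernel definitions, with parameter names (p, σ) taken from the excerpt rather than from any published code:

```python
# The two kernels named in the excerpt, as plain functions (illustrative).
import numpy as np

def ip_kernel(u, v, p=2):
    """Inhomogeneous polynomial kernel of order p: (1 + u.v)^p."""
    return (1.0 + np.dot(u, v)) ** p

def gaussian_kernel(u, v, sigma=1.0):
    """Gaussian (RBF) kernel of width sigma."""
    return np.exp(-np.sum((u - v) ** 2) / (2.0 * sigma ** 2))

u = np.array([1.0, 0.0])
v = np.array([0.5, 0.5])
print(ip_kernel(u, v))        # (1 + 0.5)^2 = 2.25
print(gaussian_kernel(u, v))  # exp(-0.25), about 0.7788
```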
“…A convenient nonlinear generalization of GC has been implemented in [14] by exploiting the kernel trick, which makes it possible to compute dot products in high-dimensional feature spaces using simple functions (kernels) defined on pairs of input patterns. This trick allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, for example Support Vector Machines [15].…”
Section: Granger Causality
confidence: 99%
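The kernel trick described in this excerpt can be checked numerically: for the inhomogeneous polynomial kernel of order 2, the kernel value equals an explicit dot product in a higher-dimensional feature space. The feature map below is one illustrative choice for 2-D inputs, not taken from the cited work:

```python
# Numerical check of the kernel trick for (1 + u.v)^2 with 2-D inputs.
import numpy as np

def phi(u):
    """Explicit feature map whose dot product reproduces (1 + u.v)^2 (illustrative)."""
    x1, x2 = u
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 * x1, x2 * x2, s * x1 * x2])

u = np.array([0.3, -1.2])
v = np.array([2.0, 0.5])
kernel_value = (1.0 + np.dot(u, v)) ** 2   # simple function of the inputs
feature_dot = np.dot(phi(u), phi(v))       # dot product in feature space
print(np.isclose(kernel_value, feature_dot))  # True: same value, no explicit map needed
```

This is why the kernel-based formulation never has to construct the feature space explicitly: every dot product it needs is evaluated directly on pairs of input patterns.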
“…This trick allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, for example Support Vector Machines [15]. Thus, although the aim in [14] is still to perform linear GC, it does so within a space defined by the nonlinear features of the data. This projection is conveniently and implicitly performed through kernel functions [16], in addition to using a statistical procedure to avoid over-fitting.…”
Section: Granger Causality
confidence: 99%