2015
DOI: 10.1103/PhysRevE.92.022126

Normalizing the causality between time series

Abstract: Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoreg…

Cited by 103 publications (100 citation statements); References 20 publications.
“…From Liang (, ), the rate of IF from $X_2$ (e.g., SM or sounding wind speeds) to $X_1$ (e.g., T2) can be defined as
$$T_{2\to 1} = \frac{C_{11}\,C_{12}\,C_{2,d1} - C_{12}^2\,C_{1,d1}}{C_{11}^2\,C_{22} - C_{11}\,C_{12}^2},$$
where $C_{ij}$ is the sample covariance between $X_i$ and $X_j$, $C_{i,dj}$ is the covariance between $X_i$ and $\dot X_j$, and $\dot X_j$ is the difference approximation of $dX_j/dt$ using the Euler forward scheme:
$$C_{ij} = \overline{(X_i - \bar X_i)(X_j - \bar X_j)}, \qquad C_{i,dj} = \overline{(X_i - \bar X_i)(\dot X_j - \bar{\dot X}_j)}, \qquad \dot X_j = \frac{X_{j,n+1} - X_{j,n}}{\Delta t}. \; …”
Section: Data and Analysis Methods
Confidence: 99%
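The covariance estimator quoted above translates almost directly into code. Below is a minimal illustrative sketch, not the citing authors' implementation; the function name `liang_T21` and the white-noise sanity check are assumptions of this sketch.

```python
import numpy as np

def liang_T21(x1, x2, dt=1.0):
    """Rate of information flow T_{2->1} from x2 to x1, using the
    sample-covariance formula quoted above (Euler-forward differences)."""
    dx1 = (x1[1:] - x1[:-1]) / dt          # difference approximation of dX1/dt
    x1, x2 = x1[:-1], x2[:-1]              # align the series with the differences
    c = lambda u, v: np.mean((u - u.mean()) * (v - v.mean()))
    C11, C12, C22 = c(x1, x1), c(x1, x2), c(x2, x2)
    C1d1, C2d1 = c(x1, dx1), c(x2, dx1)
    return (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11**2 * C22 - C11 * C12**2)

# Sanity check: two independent white-noise series should carry
# essentially no information flow in either direction.
rng = np.random.default_rng(0)
a, b = rng.normal(size=50000), rng.normal(size=50000)
T_noise = liang_T21(a, b)                  # expected to be close to zero
```

For independent series the numerator involves only small sample covariances, so the estimate shrinks toward zero as the series lengthen.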
“…It needs to be normalized in order to have its importance assessed. According to Liang (), the variation of $X_1$ is usually caused by three terms: the IF from $X_2$ to $X_1$ ($T_{2\to 1}$), variation within itself ($dH_1^*/dt$), and the stochastic effect of noise ($dH_1^{\mathrm{noise}}/dt$), where $H_1$ is the marginal entropy of $X_1$ (see details in Liang, and 2014). To evaluate the importance of the IF relative to the other two processes, $\tau_{2\to 1}$ is proposed as
$$\tau_{2\to 1} = \frac{T_{2\to 1}}{Z_{2\to 1}}, \qquad Z_{2\to 1} = |T_{2\to 1}| + \left|\frac{dH_1^*}{dt}\right| + \left|\frac{dH_1^{\mathrm{noise}}}{dt}\right|,$$
while
$$\frac{dH_1^*}{dt} = p, \qquad \frac{dH_1^{\mathrm{noise}}}{dt} = \frac{\Delta t}{2C_{11}}\left(C_{d1,d1} + p^2 C_{11} + q^2 C_{22} - 2p\,C_{d1,1} - 2q\,C_{d1,2} + 2pq\,C_{12}\right), \qquad p = \frac{C_{22}\,C_{1,d1} - C_{12}\,C_{2,d1}}{C_{11}\,C_{22} - C_{12}^2}\; …”
Section: Data and Analysis Methods
Confidence: 99%
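The normalization quoted above can likewise be sketched numerically. This is a rough illustration under the quoted formulas (with the closed form for $q$ taken as the least-squares counterpart of the quoted $p$, since the excerpt is truncated); the function name, the toy simulation, and $\Delta t = 1$ are assumptions of this sketch.

```python
import numpy as np

def liang_tau21(x1, x2, dt=1.0):
    """Normalized information flow tau_{2->1} = T_{2->1} / Z_{2->1},
    following the decomposition quoted above; p and q are the linear
    coefficient estimates appearing in the quotation (q's closed form
    is an assumption, mirrored from p)."""
    dx1 = (x1[1:] - x1[:-1]) / dt
    x1, x2 = x1[:-1], x2[:-1]
    c = lambda u, v: np.mean((u - u.mean()) * (v - v.mean()))
    C11, C12, C22 = c(x1, x1), c(x1, x2), c(x2, x2)
    C1d1, C2d1, Cd1d1 = c(x1, dx1), c(x2, dx1), c(dx1, dx1)
    det = C11 * C22 - C12**2

    T21 = (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11 * det)
    p = (C22 * C1d1 - C12 * C2d1) / det              # dH1*/dt
    q = (C11 * C2d1 - C12 * C1d1) / det
    dH1_noise = dt / (2 * C11) * (Cd1d1 + p**2 * C11 + q**2 * C22
                                  - 2 * p * C1d1 - 2 * q * C2d1
                                  + 2 * p * q * C12)
    Z = abs(T21) + abs(p) + abs(dH1_noise)           # Z_{2->1}
    return T21 / Z                                   # bounded in [-1, 1]

# Illustration on a toy system in which x2 drives x1:
rng = np.random.default_rng(0)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x2[t] = 0.7 * x2[t - 1] + rng.normal()
    x1[t] = 0.2 * x1[t - 1] + 0.6 * x2[t - 1] + rng.normal()
tau = liang_tau21(x1, x2)
```

Because $Z_{2\to 1}$ contains $|T_{2\to 1}|$ as one of its terms, $\tau_{2\to 1}$ is bounded by 1 in magnitude by construction.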
“…[37,38] another decomposition is proposed to detect redundant and synergetic contributions of driving variables. Liang [39,40] presents a rigorous approach, based on the underlying Langevin description of a system, to define the contributions of internal and external driving to the evolution of the entropy of a subprocess Y. This approach requires knowledge of the deterministic and stochastic equations of the system, but in principle the flow can also be estimated from time series alone, at the cost of a numerical optimization problem.…”
Section: A. Quantifying Causal Information Transfer
Confidence: 99%
“…We overcame this limitation by using causality analyses to quantify the connectivity between variables. Such causality, or information transfer, has been defined within the framework of information theory [69, 84–86]. Three basic tenets of information transfer are the following: (1) causality implies correlation, but correlation does not imply causality; (2) causality implies directionality, meaning that the transfer of information detects the direction of information transfer between two systems; and (3) asymmetry is a basic property of information transfer.…”
Section: Appendix A. Correlation and Causality
Confidence: 99%
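The directionality and asymmetry tenets can be illustrated numerically: the correlation coefficient is symmetric under exchange of the two series, while the information-flow estimate is not. A sketch under the covariance estimator quoted earlier on this page; the one-way coupled toy system and all names are assumptions of this illustration.

```python
import numpy as np

def info_flow(xa, xb, dt=1.0):
    """T_{b->a}: information flow from xb to xa (covariance estimator
    quoted earlier on this page)."""
    da = (xa[1:] - xa[:-1]) / dt
    xa, xb = xa[:-1], xb[:-1]
    c = lambda u, v: np.mean((u - u.mean()) * (v - v.mean()))
    Caa, Cab, Cbb = c(xa, xa), c(xa, xb), c(xb, xb)
    Cada, Cbda = c(xa, da), c(xb, da)
    return (Caa * Cab * Cbda - Cab**2 * Cada) / (Caa**2 * Cbb - Caa * Cab**2)

# One-way coupled system: x2 drives x1, never the reverse.
rng = np.random.default_rng(1)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x2[t] = 0.7 * x2[t - 1] + rng.normal()
    x1[t] = 0.2 * x1[t - 1] + 0.6 * x2[t - 1] + rng.normal()

r = np.corrcoef(x1, x2)[0, 1]   # correlation: symmetric, direction-blind
T21 = info_flow(x1, x2)         # flow x2 -> x1: substantial
T12 = info_flow(x2, x1)         # flow x1 -> x2: near zero
```

The two series are strongly correlated in both "directions" (correlation is a single symmetric number), yet the flow estimate is large only in the driving direction, which is the asymmetry the quotation describes.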
“…A measure of the relative flow of information [86], in terms of the marginal entropy, can be given as
$$\tau_{2\to 1} = \frac{T_{2\to 1}}{|T_{2\to 1}| + \left|\frac{dH_1^*}{dt}\right| + \left|\frac{dH_1^{\mathrm{noise}}}{dt}\right|}.$$
Additionally, the two components of entropy (eqn. A7), estimated in terms of $p$ and $q$, are [86]
$$\frac{dH_1^*}{dt} = p, \qquad \frac{dH_1^{\mathrm{noise}}}{dt} = \frac{\Delta t}{2C_{11}}\left(C_{d1,d1} + p^2 C_{11} + q^2 C_{22} - 2p\,C_{d1,1} - 2q\,C_{d1,2} + 2pq\,C_{12}\right) …”
Section: Appendix A. Correlation and Causality
Confidence: 99%