2008
DOI: 10.1103/physreve.77.026214

Inferring the directionality of coupling with conditional mutual information

Abstract: Uncovering the directionality of coupling is a significant step in understanding drive-response relationships in complex systems. In this paper, we discuss a nonparametric method for detecting the directionality of coupling based on the estimation of information theoretic functionals. We consider several different methods for estimating conditional mutual information. The behavior of each estimator with respect to its free parameter is shown using a linear model where an analytical estimate of conditional mutu…


Cited by 167 publications (170 citation statements)
References 50 publications
“…In the latter methods, the probability distribution function estimation depends on the distances between the samples computed using some metric. An example of a metric method is the k-nearest neighbor (kNN [15,25]) algorithm. For more detail on methods of estimation of conditional mutual information and their comparison, see [15].…”
Section: TE Estimation
confidence: 99%
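The kNN-based estimation of conditional mutual information mentioned in the excerpt above can be sketched as follows. This is a minimal, brute-force illustration in the spirit of the Frenzel-Pompe kNN estimator; the function names, the scalar digamma approximation, and the parameter choices are illustrative assumptions, not the cited implementation.

```python
import numpy as np

def _digamma(x):
    """Scalar digamma via recurrence plus asymptotic series."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + np.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def cmi_knn(x, y, z, k=4):
    """kNN (Frenzel-Pompe style) estimate of I(X; Y | Z).

    Uses O(n^2) brute-force max-norm distances; fine for small n.
    """
    psi = np.vectorize(_digamma)
    xyz = np.column_stack([x, y, z])

    def maxdist(a):
        d = np.max(np.abs(a[:, None, :] - a[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)
        return d

    # radius of the k-th nearest neighbour in the full (x, y, z) space
    eps = np.sort(maxdist(xyz), axis=1)[:, k - 1]

    def n_within(cols):
        d = maxdist(np.column_stack(cols))
        return np.sum(d < eps[:, None], axis=1)

    n_xz = n_within([x, z])
    n_yz = n_within([y, z])
    n_z = n_within([z])
    return _digamma(float(k)) - np.mean(
        psi(n_xz + 1) + psi(n_yz + 1) - psi(n_z + 1))
```

For conditionally independent X and Y the estimate fluctuates around zero, while a residual X-Y dependence beyond Z yields a clearly positive value; the free parameter k plays the role discussed in the cited comparison.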
“…The first is an algorithm based on the discretization of studied variables into Q equiquantal bins (EQQ [24], Q ∈ {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14}) and the second is a k-nearest neighbor algorithm (kNN [15,25], k ∈ {2, 4, 8, 16, 32, 64, 128, 256, 512}).…”
Section: Implementation Details
confidence: 99%
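The equiquantal (EQQ) discretization named in the excerpt above can be sketched with a rank-based binning followed by a plug-in entropy formula. The helper names and the decomposition I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(Z) − H(X,Y,Z) are illustrative assumptions, not the cited code.

```python
import numpy as np

def equiquantal_bins(v, q):
    """Assign samples to q (nearly) equal-occupancy bins by rank."""
    ranks = np.argsort(np.argsort(v))      # 0..n-1, ties broken by order
    return (ranks * q) // len(v)

def cmi_eqq(x, y, z, q=4):
    """Plug-in I(X; Y | Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z) on EQQ bins."""
    xb, yb, zb = (equiquantal_bins(v, q) for v in (x, y, z))

    def H(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0,
                              return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    return H(xb, zb) + H(yb, zb) - H(zb) - H(xb, yb, zb)
```

The bin count Q is the free parameter scanned in the excerpt (Q = 2…14); larger Q resolves finer structure but inflates the positive bias of the plug-in estimate.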
“…Transfer entropy has been proposed to effectively distinguish driving and responding elements and to detect asymmetry in the interaction of subsystems. It is widely applicable because it is model-free and sensitive to nonlinear signal properties [32]. Thus transfer entropy is able to measure the influence that one system can exert over another.…”
Section: Primary Contribution Of This Work
confidence: 99%
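The transfer entropy described in the excerpt above can be written as a conditional mutual information, TE_{X→Y} = I(Y_{t+1}; X_t | Y_t). The sketch below estimates it with a plug-in formula on rank-based bins and history length 1 on both sides; these simplifications are illustrative assumptions, not the procedure of the cited work.

```python
import numpy as np

def transfer_entropy(x, y, q=4):
    """Plug-in TE_{X->Y} = I(Y_{t+1}; X_t | Y_t) on equal-occupancy bins.

    History length 1 on both sides; purely illustrative.
    """
    def bins(v):
        return (np.argsort(np.argsort(v)) * q) // len(v)

    xp, yp, yf = bins(x[:-1]), bins(y[:-1]), bins(y[1:])

    def H(*cols):
        _, counts = np.unique(np.column_stack(cols), axis=0,
                              return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log(p))

    return H(yf, yp) + H(xp, yp) - H(yp) - H(yf, xp, yp)
```

On a unidirectionally coupled pair (X autonomous, Y driven by X) the asymmetry the excerpt describes shows up directly: TE_{X→Y} comes out clearly larger than TE_{Y→X}.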