2010 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2010.5513624

Universal estimation of directed information

Abstract: Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments. The first is a Shannon-McMillan-Breiman type estimator, similar to those used by Verdú (2005) and Cai, Kulkarni, and Verdú (2006) for the estimation of other information measures. We show the almost sure and L1 convergence properties of the estimator for any underlying universal probability assignment. The other three estimato…

Cited by 16 publications (13 citation statements); references 25 publications.
“…For example, the directed information [25]–[28] and Schreiber's transfer entropy [29] are commonly applied to infer the causal structure and to characterize the information transfer process. Moreover, drawing on ideas from dynamical systems theory, new information transfer measures have been proposed to indicate the causality between states and to control the systems [30]–[32].…”
Section: Related Work for Information Measures in Big Data
confidence: 99%
“…The estimator Î₁ is adapted from a universal divergence estimator in [14]. One disadvantage of Î₁(X^n → Y^n) is that it has a nonzero probability of being very large; this is overcome by Î₂, the estimator introduced in [20], which uses information-theoretic functionals to "smooth" the estimate. Evidently we can show |Î₂| ≤ log |Y|.…”
Section: Four Estimation Algorithms
confidence: 99%
“…Based on a parametric generalized linear model (GLM) assumption and a stationary ergodic Markov assumption [19], they showed strong consistency results. Compared to [19], Zhao, Kim, Permuter, and Weissman [20] focused on universal methods and showed L1 consistency for all jointly stationary ergodic process pairs with finite alphabets.…”
Section: Introduction
confidence: 99%
“…It was pointed out in [24] that when two processes are jointly Gauss-Markov, Granger causality and directed information are equivalent. Equivalence relations between the two were derived in [4] for Gaussian linear models and in [5] under fairly general frameworks.…”
Section: Introduction
confidence: 99%