2019
DOI: 10.1109/tsp.2019.2920472

Probing High-Order Dependencies With Information Theory

Abstract: Information theoretic measures (entropies, entropy rates, mutual information) are nowadays commonly used in statistical signal processing for real-world data analysis. The present work proposes the use of Auto Mutual Information (Mutual Information between subsets of the same signal) and entropy rate as powerful tools to assess refined dependencies of any order in signal temporal dynamics. Notably, it is shown how two-point Auto Mutual Information and entropy rate unveil information conveyed by higher order st…
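The two-point Auto Mutual Information described in the abstract can be sketched numerically. The snippet below is a minimal, hypothetical illustration rather than the authors' implementation: it estimates I(x_t ; x_{t+τ}) as H(X) + H(Y) − H(X, Y) using Kozachenko–Leonenko k-nn entropy estimates; the AR(1) test signal, the lag values and k = 5 are arbitrary choices.

```python
# Minimal sketch (assumption, not the authors' code): two-point Auto Mutual
# Information I(x_t ; x_{t+tau}) of a single signal, estimated as
# H(X) + H(Y) - H(X, Y) with Kozachenko-Leonenko k-nn entropy estimates.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_entropy(samples, k=5):
    """Kozachenko-Leonenko differential entropy estimate (nats), max-norm."""
    samples = np.asarray(samples, dtype=float)
    if samples.ndim == 1:
        samples = samples[:, None]
    n, d = samples.shape
    tree = cKDTree(samples)
    # distance to the k-th neighbour (column 0 is the point itself)
    eps = tree.query(samples, k=k + 1, p=np.inf)[0][:, -1]
    # a max-norm ball of radius eps has volume (2*eps)^d
    return digamma(n) - digamma(k) + d * np.log(2.0) + d * np.mean(np.log(eps))

def auto_mutual_information(x, tau, k=5):
    """I(x_t ; x_{t+tau}) for a 1-D signal x, in nats."""
    x = np.asarray(x, dtype=float)
    past, future = x[:-tau], x[tau:]
    joint = np.column_stack([past, future])
    return knn_entropy(past, k) + knn_entropy(future, k) - knn_entropy(joint, k)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # AR(1) toy signal: dependence decays as the lag tau grows
    x = np.zeros(2**14)
    for t in range(1, x.size):
        x[t] = 0.9 * x[t - 1] + rng.standard_normal()
    for tau in (1, 4, 16, 64):
        print(f"tau = {tau:3d}   AMI = {auto_mutual_information(x, tau):.3f} nats")
```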

Cited by 8 publications (19 citation statements). References 51 publications (75 reference statements).
“…When k/T^(1/(m+1)) is reduced, first the bias is positive and diminishes toward negative values and then converges to zero. This behavior was previously reported for the k-nn mutual information estimator applied for stationary processes [16,38,39], and we confirm it is valid for the fBm. We observed the same convergence for a large range of scales τ > 1: the ersatz entropy rate then converges to H_1^fBm + H ln τ for large T with the same behavior of the bias.…”
Section: Convergence / Bias (supporting)
confidence: 90%
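For context on the H_1^fBm + H ln τ law quoted above, a heuristic one-point sketch (assumed notation: H is the Hurst exponent, σ_τ the standard deviation of fBm increments at scale τ) shows where the logarithmic dependence on τ comes from:

```latex
% Heuristic sketch (assumed notation): fBm increments at scale tau are Gaussian
% with sigma_tau = sigma_1 * tau^H, and a Gaussian variable has differential
% entropy (1/2) ln(2 pi e sigma^2), which produces the H ln(tau) term.
\begin{align*}
  H_1(\tau) = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma_\tau^{2}\right)
            = \tfrac{1}{2}\ln\!\left(2\pi e\,\sigma_1^{2}\right) + H\ln\tau
            = H_1^{\mathrm{fBm}} + H\ln\tau .
\end{align*}
```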
“…Fig. 4b shows that for a fixed window size T = 2^16 the ersatz entropy rate is proportional to H ln(τ). We have added a black line defined by the linear function H_1^fBm + H ln τ, as suggested by eq.…”
Section: Dependence on τ (mentioning)
confidence: 98%
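The proportionality to ln τ mentioned in this statement can be checked with an ordinary least-squares fit. The following snippet is a hypothetical illustration on synthetic numbers generated from the assumed linear law (H_true, H1_fbm and the noise level are made up), not data from the cited work:

```python
# Hypothetical illustration (synthetic numbers, not data from the cited work):
# recover the Hurst exponent as the slope of the ersatz entropy rate against
# ln(tau), following the linear law H_1^fBm + H * ln(tau) quoted above.
import numpy as np

rng = np.random.default_rng(1)
taus = np.array([2, 4, 8, 16, 32, 64, 128])
H_true, H1_fbm = 0.7, 1.42                       # assumed parameters
h_tau = H1_fbm + H_true * np.log(taus) + 0.01 * rng.standard_normal(taus.size)

# least-squares fit of h(tau) = intercept + slope * ln(tau)
slope, intercept = np.polyfit(np.log(taus), h_tau, 1)
print(f"estimated H  = {slope:.3f}")
print(f"estimated H1 = {intercept:.3f}")
```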
“…We report in Table 5 the average fraction of missing data points in a time-window of size T = 20 min: increasing the UCO strength is typically associated with an increase of missing data. Let's focus on the entropy rate h_k(τ), which is algorithmically the most complex quantity: it has been reported that the bias of h_k(τ) not only behaves as 1/N (39, 40), similar to the bias of a sliding average over N points like m_k(τ), but also that this bias is small. A time-window of 20 min should contain N = 20 × 60 × 4 = 4,800 points, and even a reduction of 50% of available data points should leave more than 2,000 points, so a bias smaller than 1% (40).…”
Section: Results and Discussion: Features, Time-Scales and Distance to Healthy State (mentioning)
confidence: 99%
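The point-count argument in this last statement can be reproduced directly. A small sketch, assuming the 4 Hz sampling rate implied by 20 × 60 × 4 and the 1/N-type bias decay the quote refers to:

```python
# Worked check of the point-count argument above: a 20-min window sampled at
# 4 Hz (assumed rate), with up to 50% missing data, still leaves thousands of
# points, so a bias decaying roughly as 1/N (as referenced in the quote)
# remains well below 1%.
window_min, fs_hz = 20, 4
n_total = window_min * 60 * fs_hz                # 4,800 points
for missing_frac in (0.0, 0.25, 0.5):
    n_avail = int(n_total * (1 - missing_frac))
    print(f"missing {missing_frac:4.0%}: N = {n_avail:5d}, 1/N = {1 / n_avail:.2e}")
```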