2019
DOI: 10.3390/e21121223

Information Theory for Non-Stationary Processes with Stationary Increments

Abstract: We describe how to analyze the wide class of non-stationary processes with stationary centered increments using Shannon information theory. To do so, we use a practical viewpoint and define ersatz quantities from time-averaged probability distributions. These ersatz versions of entropy, mutual information and entropy rate can be estimated when only a single realization of the process is available. We abundantly illustrate our approach by analyzing Gaussian and non-Gaussian self-similar signals, as well as mult…
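A minimal sketch of the time-averaging idea described in the abstract, assuming a plain histogram plug-in estimator (the paper is not tied to this estimator, and the function name and parameters below are illustrative): increments at a given lag are pooled over the whole realization to form a single, time-averaged distribution, whose entropy is then estimated from one realization of ordinary Brownian motion, the simplest self-similar process with stationary increments.

```python
import numpy as np

rng = np.random.default_rng(0)

def ersatz_increment_entropy(x, tau, bins=64):
    """Histogram plug-in estimate of the differential entropy of the
    time-averaged distribution of the increments x(t + tau) - x(t)."""
    incr = x[tau:] - x[:-tau]                      # pool increments over time
    p, edges = np.histogram(incr, bins=bins, density=True)
    widths = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]) * widths[nz])

# Single realization of a non-stationary process with stationary increments:
# ordinary Brownian motion (H = 1/2).
x = np.cumsum(rng.standard_normal(2**16))

for tau in (1, 4, 16, 64):
    h_est = ersatz_increment_entropy(x, tau)
    h_gauss = 0.5 * np.log(2 * np.pi * np.e * tau)  # exact entropy of N(0, tau) increments
    print(f"tau = {tau:3d}   estimated {h_est:.3f}   Gaussian theory {h_gauss:.3f}")
```

For Brownian motion the estimated ersatz entropy should grow as (1/2) log(tau) plus a constant, which is the kind of self-similar scaling the paper exploits.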


Cited by 23 publications (35 citation statements). References 41 publications (77 reference statements).
“…We report in Table 5 the average fraction of missing data points in a time-window of size T = 20 min: increasing the UCO strength is typically associated with an increase of missing data. Let's focus on the entropy rate h_k(τ), which is algorithmically the most complex quantity: it has been reported that the bias of h_k(τ) not only behaves as 1/N (39, 40), similar to the bias of a sliding average over N points like m_k(τ), but also that this bias is small. A time-window of 20 min should contain N = 20 × 60 × 4 = 4,800 points, and even a reduction of 50% of the available data points should leave more than 2,000 points, so a bias smaller than 1% (40). We are thus confident that the reported results are not an indirect measure of the number of missing data points.…”
Section: Results and Discussion: Features, Time-Scales and Distance to Healthy State (mentioning)
confidence: 99%
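As a back-of-envelope check of the window-size argument above, assuming the factor of 4 in N = 20 × 60 × 4 corresponds to a 4 Hz sampling rate, a minimal sketch:

```python
# Window-size arithmetic from the citation statement (4 Hz sampling assumed).
sampling_hz = 4
n_full = 20 * 60 * sampling_hz        # points in a 20-minute window: 4800
n_half = n_full // 2                  # points left after losing 50% of the data: 2400
print(n_full, n_half, f"1/N ~ {1.0 / n_half:.2%}")  # 1/N bias scale, well below 1%
```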
“…To apply the estimators, the N time series must first be obtained. Synthetic signals with known H are obtained by simulating fractional Gaussian noise (fGn) [24], which has stationary increments, using the Davies and Harte method described in detail in [25].…”
Section: Working Methodology (mentioning)
confidence: 99%
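A minimal sketch of the Davies and Harte (circulant-embedding) simulation of fGn mentioned above, in plain NumPy; the function name and defaults are illustrative and not taken from [25]:

```python
import numpy as np

def fgn_davies_harte(n, H, rng=None):
    """Simulate n samples of fractional Gaussian noise with Hurst exponent H
    (unit time step) via the Davies-Harte circulant-embedding method."""
    rng = np.random.default_rng(rng)
    # Autocovariance of fGn: gamma(k) = 0.5*(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})
    k = np.arange(n + 1)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
                   + np.abs(k - 1) ** (2 * H))
    # First row of the 2n x 2n circulant matrix embedding the covariance
    row = np.concatenate([gamma, gamma[-2:0:-1]])   # length 2n
    lam = np.fft.fft(row).real                      # circulant eigenvalues, must be >= 0
    if np.any(lam < 0):
        raise ValueError("circulant embedding is not nonnegative definite")
    m = 2 * n
    # Complex Gaussian weights with the Hermitian symmetry that makes the output real
    w = np.zeros(m, dtype=complex)
    w[0] = np.sqrt(lam[0] / m) * rng.standard_normal()
    w[n] = np.sqrt(lam[n] / m) * rng.standard_normal()
    z1 = rng.standard_normal(n - 1)
    z2 = rng.standard_normal(n - 1)
    w[1:n] = np.sqrt(lam[1:n] / (2 * m)) * (z1 + 1j * z2)
    w[n + 1:] = np.conj(w[1:n][::-1])
    return np.fft.fft(w)[:n].real

# Example: one fGn realization and its cumulative sum (an fBm path)
x = fgn_davies_harte(4096, H=0.7, rng=0)
fbm_path = np.cumsum(x)
```

The FFT-based embedding gives exact Gaussian samples with the fGn covariance at O(n log n) cost, which is why it is the standard choice for generating long synthetic signals with a prescribed H.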
“…In [20] they estimate D based on the power-law behavior expressed by the above expression. Moreover, from the definition of fractional Brownian motion (fBm), these fBm processes must be governed by [26] E[|B_H(t) − B_H(s)|^2] ∝ |t − s|^(2H), where 0 < H < 1 is the Hurst exponent of the fBm process.…”
Section: Fractal Processes (mentioning)
confidence: 99%
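To illustrate how the Hurst exponent can be read off the increment power law, the snippet below fits the scaling of the mean squared increment on a simulated Brownian path (true H = 1/2); this is a generic check, not the estimation procedure of [20] or [26]:

```python
import numpy as np

rng = np.random.default_rng(1)

# Check E|B_H(t+tau) - B_H(t)|^2 ~ tau^(2H) and recover H by a log-log fit.
x = np.cumsum(rng.standard_normal(2**18))          # Brownian motion, H = 1/2
taus = np.array([1, 2, 4, 8, 16, 32, 64, 128])
msq = np.array([np.mean((x[t:] - x[:-t]) ** 2) for t in taus])
slope, intercept = np.polyfit(np.log(taus), np.log(msq), 1)
print(f"estimated H = {slope / 2:.3f}   (expected 0.5)")
```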