2004
DOI: 10.1103/physreve.70.011106
Entropy in the natural time domain

Abstract: A surrogate data analysis is presented, which is based on the fluctuations of the "entropy" S defined in the natural time domain [Phys. Rev. E 68, 031106 (2003)]. This entropy is not a static one such as, for example, the Shannon entropy. The analysis is applied to three types of time series, i.e., seismic electric signals, "artificial" noises, and electrocardiograms, and it "recognizes" the non-Markovianity in all these signals. Furthermore, it differentiates the electrocardiograms of healthy humans from tho…

Cited by 126 publications (160 citation statements)
References 20 publications
“…[6]). We emphasize that the aforementioned points should not be misinterpreted as stating that the simple logistic map model treated here can capture the complex heart dynamics, but only can be seen in the following frame: Since sudden cardiac arrest (which may occur even if the electrocardiogram looks similar to that of H) may be considered as a dynamic phase transition [5,6], it is reasonable to expect that the entropy fluctuations significantly increase upon approaching the transition. …”
Section: S < S_u
confidence: 99%
“…11), 14) Static entropy solely depends on the probability distribution and hence remains unaltered when changing the order of the events, e.g., upon randomization ("shuffling"), while in a dynamic entropy the order of consecutive events plays an important role. The Shannon entropy is used, because our interest here is focused on the statistical properties, while when studying the dynamic evolution of a system, the "entropy" in the natural time S = ⟨χ ln χ⟩ − ⟨χ⟩ ln⟨χ⟩ should be preferred.…”
Section: Introduction: The Best Known Scaling Relation For Earthquake
confidence: 99%
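The dynamic character of the natural-time entropy quoted above can be illustrated with a minimal sketch. It assumes the standard natural-time construction from the cited paper, in which the k-th of N events is read at χ_k = k/N and weighted by p_k = Q_k / ΣQ_n, where Q_k is the event's energy; the function name is ours, not the authors'.

```python
import math

def natural_time_entropy(energies):
    """S = <chi ln chi> - <chi> ln<chi> in natural time.

    The k-th of N events occurs at natural time chi_k = k/N and is
    weighted by p_k = Q_k / sum(Q_n), where Q_k is its energy.
    """
    n = len(energies)
    total = sum(energies)
    p = [q / total for q in energies]          # normalized event weights
    chi = [k / n for k in range(1, n + 1)]     # natural times chi_k = k/N
    avg_chi_ln_chi = sum(pk * c * math.log(c) for pk, c in zip(p, chi))
    avg_chi = sum(pk * c for pk, c in zip(p, chi))
    return avg_chi_ln_chi - avg_chi * math.log(avg_chi)
```

For uniformly weighted events S approaches the "uniform" value S_u = ln 2 / 2 − 1/4 ≈ 0.0966 mentioned in the surrounding sections, and reordering ("shuffling") unevenly weighted events changes S even though the Shannon entropy of the p_k is unchanged, which is exactly the static-versus-dynamic distinction drawn in the quote.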
“…This was repeated for various b-values by keeping the total number (500,000) of events constant. The data for b~1 should be equivalent to the "shuffled" 14) data of an actual earthquake catalogue and their probability density functions (pdf) should be the same. (As it is explained further in the Discussion, we focus here on the self-similarity exponent that stems from the distribution of the process' increments only, not from the memory of the process.…”
Section: Introduction: The Best Known Scaling Relation For Earthquake
confidence: 99%
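The synthetic-catalogue step described in the quote above (fixed event count, varying b-value) can be sketched with inverse-transform sampling from the Gutenberg-Richter law P(M > m) ∝ 10^(−bm). The minimum magnitude, the seed, and the function name are illustrative assumptions, not details given in the excerpt.

```python
import math
import random

def gutenberg_richter_catalogue(n_events, b, m_min=2.0, seed=0):
    """Sample magnitudes with P(M > m) = 10**(-b * (m - m_min)).

    Inverse-transform sampling: with u uniform in (0, 1], the magnitude
    M = m_min - log10(u) / b reproduces the Gutenberg-Richter tail.
    """
    rng = random.Random(seed)
    mags = []
    for _ in range(n_events):
        u = 1.0 - rng.random()  # uniform in (0, 1], avoids log10(0)
        mags.append(m_min - math.log10(u) / b)
    return mags
```

For a natural-time analysis, each magnitude would then be converted to an event energy, commonly taken proportional to 10^(1.5 M) (a standard seismological scaling, assumed here rather than stated in the excerpt).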