1995
DOI: 10.1016/0167-2789(95)00041-2

Generalized redundancies for time series analysis

Abstract: Extensions to various information-theoretic quantities used for nonlinear time series analysis are discussed, as well as their relationship to the generalized correlation integral. It is shown that calculating redundancies from the correlation integral can be more accurate and more efficient than direct box-counting methods. It is also demonstrated that many commonly used nonlinear statistics have information-theory-based analogues. Furthermore, the relationship between the correlation integral and information…
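As a concrete illustration of the correlation-integral route that the abstract contrasts with box counting, the sketch below estimates the second-order (q = 2) redundancy of two scalar series: the q = 2 Rényi entropy is approximated by minus the log of the correlation sum, and the redundancy (mutual information) combines the marginal and joint sums. This is a minimal sketch under those assumptions, not the paper's implementation; the maximum-norm metric, the tolerance eps, and the test data are illustrative choices.

```python
import numpy as np

def correlation_sum(points, eps):
    """Fraction of distinct point pairs closer than eps (maximum norm)."""
    pts = np.atleast_2d(points).reshape(len(points), -1)
    dists = np.max(np.abs(pts[:, None, :] - pts[None, :, :]), axis=-1)
    iu = np.triu_indices(len(pts), k=1)
    return np.mean(dists[iu] < eps)

def redundancy_q2(x, y, eps):
    """q = 2 redundancy (mutual information) from correlation sums:
    I_2(eps) = log C_XY(eps) - log C_X(eps) - log C_Y(eps)."""
    cx = correlation_sum(x, eps)
    cy = correlation_sum(y, eps)
    cxy = correlation_sum(np.column_stack([x, y]), eps)
    return np.log(cxy) - np.log(cx) - np.log(cy)

# Illustration: two correlated noisy series (eps chosen by eye).
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = x + 0.5 * rng.standard_normal(1000)
print(redundancy_q2(x, y, eps=0.5))  # positive: X and Y share information
```

In practice one would examine the estimate over a range of eps values rather than a single tolerance.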

Cited by 125 publications (95 citation statements)
References 70 publications (61 reference statements)
“…In the present work, inferences are based on an information-theoretic approach in which dynamic systems are viewed as generators of information (e.g. Prichard & Theiler 1995). The use of information theory in ecology is not new and ranges from species diversity metrics and inferences about diversity–stability relationships (e.g.…”
Section: Discussion
confidence: 99%
“…Following the work of Liebert & Schuster (1989) and Prichard & Theiler (1995), the relevant quantities required by equation (2.1) can be given by…”
Section: Time-delayed Mutual Information
confidence: 99%
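The excerpt above is truncated, and equation (2.1) of that citing paper is not reproduced here. For orientation, the quantity being discussed, the time-delayed mutual information I(x_t; x_{t+τ}), can be sketched with the plain binned ("box counting") estimator; the bin count, lag range, and test signal below are arbitrary illustrative choices, not the cited method.

```python
import numpy as np

def binned_mutual_information(x, y, bins=16):
    """Plain histogram ("box counting") estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

def time_delayed_mi(x, max_lag, bins=16):
    """I(x_t; x_{t+tau}) for tau = 1..max_lag; its first minimum is a
    common choice of embedding delay."""
    return [binned_mutual_information(x[:-tau], x[tau:], bins)
            for tau in range(1, max_lag + 1)]

# Illustration: a noisy sine sampled at ~200 points per period.
t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
mi = time_delayed_mi(x, max_lag=80)
print("minimum MI over the scanned lags at lag", int(np.argmin(mi)) + 1)
```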
“…Note that this forcing function is internal to the set of six equations describing the overall flow, but represents an externally applied forcing function for the three equations describing the equations of motion. It is interesting to note that Prichard and Theiler [30], in a study of the behavior of the Rössler attractor, observed that the local entropy is apparently large when the particular parameter is large. We, however, find that the spectral entropy is dependent on the degree of order within the flow element, with lower values of the spectral entropy representing a higher degree of order within the element.…”
Section: Results
confidence: 99%
“…Note that these values of the spectral entropy occur when the fluctuating axial velocity is in the negative range and produces the most vigorous part of the aperiodic motion. Prichard and Theiler [30] indicate that the most energetic of the fluctuating components of velocity will contribute to the spectral entropy through this region and that they will be subjected to the folding and stretching of the flow elements as they lose information to spectral entropy. This region is thus a region of "dissolution" where the incoming low spectral entropy flow is transformed into a high spectral entropy region.…”
Section: Results
confidence: 99%
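The excerpts above use "spectral entropy" without defining it. The following sketch uses the common definition, the Shannon entropy of the normalized power spectrum (scaled to [0, 1] by the log of the number of frequency bins), which may differ in detail from the computation in the citing work; the test signals are illustrative.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum, scaled to
    [0, 1] by log(number of frequency bins)."""
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    psd = psd[1:]                      # drop the DC bin
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(len(psd)))

t = np.linspace(0, 20 * np.pi, 2000)
noise = np.random.default_rng(2).standard_normal(2000)
print(spectral_entropy(np.sin(t)))   # nearly periodic: low spectral entropy
print(spectral_entropy(noise))       # broadband noise: close to 1
```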
“…One of the most classical statistical complexities is the mutual information between two stochastic variables; its generalization to measuring dependence among n variables has been proposed (e.g., [13]) and explored in relation to statistical models and theories by several authors [14][15][16].…”
Section: Introduction
confidence: 99%
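For reference, the n-variable generalization of mutual information referred to here is what the 1995 paper calls the redundancy (elsewhere also known as the total correlation):

R(X_1, \dots, X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1, \dots, X_n),

which reduces to the ordinary mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) when n = 2; replacing the Shannon entropies H with Rényi entropies H_q gives the generalized redundancies of the title.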