1989
DOI: 10.1109/18.32121

Information and entropy in strange attractors

Cited by 311 publications (145 citation statements)
References 21 publications
“…The choice of an appropriate delay is calculated using the minimal mutual information technique [12,13].…”
Section: Correlation Dimension Analysis
mentioning, confidence: 99%
“…As a primary definition, the entropy H(X) of a one-dimensional discrete random variable X is H(X) = −∑_{x_i ∈ φ} p(x_i) log p(x_i), where φ is the set of values and p(x_i) is the probability of the ith value. Other important definitions include the Kolmogorov-Sinai entropy [4], the K₂ entropy [5], and the marginal redundancy algorithm given by Fraser [6]. To compute these theoretical entropy indices, a large number of data points are needed to achieve convergence.…”
Section: Introduction
mentioning, confidence: 99%
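The discrete Shannon entropy quoted above can be estimated directly from observed symbol frequencies; the dynamical extensions it mentions (Kolmogorov-Sinai entropy, K₂ entropy, Fraser's marginal redundancy) are the quantities that require large data sets to converge. A minimal plug-in sketch in Python, assuming the data are already discrete symbols; the function name and the coin examples are illustrative.

```python
import numpy as np
from collections import Counter

def shannon_entropy(samples, base=2.0):
    """Plug-in estimate of H(X) = -sum_i p(x_i) log p(x_i) from observed samples."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()             # empirical probabilities p(x_i)
    return float(-np.sum(p * np.log(p)) / np.log(base))

# A fair coin carries 1 bit per symbol; a 90/10 biased coin carries about 0.47 bits.
print(shannon_entropy([0, 1] * 500))            # ~1.0
print(shannon_entropy([0] * 900 + [1] * 100))   # ~0.469
```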
“…The analysis of deterministic chaos provides information on the system complexity and can explain a complex behaviour using a low-dimensional model (Fraser, 1989). The algorithm for nonlinear processing was developed in the seventies and eighties and has received noticeable attention in biomedical signal processing a decade later.…”
Section: Processing and Interpretation Of The Electromyogram Signal
mentioning, confidence: 99%