2014
DOI: 10.3390/e16073732

On the Connections of Generalized Entropies With Shannon and Kolmogorov–Sinai Entropies

Abstract: We consider the concept of generalized Kolmogorov–Sinai entropy, where instead of the Shannon entropy function we use an arbitrary concave function defined on the unit interval, vanishing at the origin. Under mild assumptions on this function, we show that this isomorphism invariant is linearly dependent on the Kolmogorov–Sinai entropy.
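As a rough illustration of the static version of the object described in the abstract, the following Python sketch (our own, not taken from the paper; the function g is an arbitrary example) compares the Shannon entropy of a finite partition with a generalized entropy obtained by summing a concave function g, with g(0) = 0, over the cell measures.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum p_i log p_i of a probability vector (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log(p)))

def generalized_entropy(p, g):
    """Generalized (static) entropy: sum of g(p_i) for a concave g with g(0) = 0."""
    p = np.asarray(p, dtype=float)
    return float(np.sum(g(p)))

partition = [0.5, 0.25, 0.125, 0.125]   # measures of the cells of a finite partition
g = lambda x: x * (1.0 - x)             # one example of a concave function vanishing at 0
print("Shannon:    ", shannon_entropy(partition))
print("Generalized:", generalized_entropy(partition, g))
```

The dynamical (Kolmogorov–Sinai) versions of both quantities are obtained by applying such static entropies to iteratively refined partitions; the paper's result concerns that dynamical limit.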

Cited by 14 publications (10 citation statements); references 27 publications.
“…By describing the chaos in a dynamic system, the Kolmogorov entropy is expected to be strongly connected with the Lyapunov exponent λ; see [21]. For more information about the theoretical aspects of entropy, its generalizations and entropy-like measures, which can be used to measure the complexity of a system, see [22][23][24][25][26]. Kolmogorov entropy for 1D-regular, chaotic-deterministic and random systems.…”
Section: Definition 3 (Kolmogorov Entropy) (mentioning)
confidence: 99%
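To make the quoted connection concrete: for a one-dimensional chaotic map, Pesin-type results identify the Kolmogorov–Sinai entropy with the positive Lyapunov exponent. The sketch below is our own illustration (not taken from the cited work) and estimates λ for the logistic map at parameter 4, where the known value is ln 2.

```python
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.1234, n_transient=1_000, n_iter=100_000):
    """Estimate the Lyapunov exponent of x -> r x (1 - x) by averaging log|f'(x)|."""
    x = x0
    for _ in range(n_transient):                 # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_iter):
        deriv = abs(r * (1.0 - 2.0 * x))
        acc += np.log(max(deriv, 1e-300))        # guard against log(0) at x = 0.5
        x = r * x * (1.0 - x)
    return acc / n_iter

print(f"estimated lambda = {lyapunov_logistic():.4f}  (ln 2 = {np.log(2):.4f})")
```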
“…where p_i denotes the probability mass associated with the variable X_i such that Σ p_i = 1, and the maximum entropy value is given by H_max = log_2(m). This definition of the Shannon entropy has a relation with KSE in terms of its supremum, as (Falniowski (2014))…”
Section: Entropy and Correlation Coefficients (mentioning)
confidence: 99%
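A minimal numerical check of the quoted definition (our own sketch; the variable names are illustrative): for m equally likely outcomes the Shannon entropy reaches its maximum H_max = log_2(m), while any non-uniform distribution stays strictly below it.

```python
import numpy as np

def shannon_entropy_bits(p):
    """Shannon entropy H = -sum p_i log2 p_i of a probability mass function."""
    p = np.asarray(p, dtype=float)
    assert np.isclose(p.sum(), 1.0), "probabilities must sum to 1"
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

m = 8
uniform = np.full(m, 1.0 / m)
skewed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])
print(shannon_entropy_bits(uniform), "vs H_max =", np.log2(m))  # equal: 3.0
print(shannon_entropy_bits(skewed))                              # strictly below 3.0
```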
“…It is determined by identifying points on the trajectory in phase space that are similar to each other and not correlated in time. The divergence rate of these point pairs yields the value of KSE [60], calculated from the correlation function C_m(r, N_m), which gives the probability that two points are closer to each other than r. A higher KSE value signifies higher unpredictability; hence, KSE does not give accurate results for signals with even the slightest noise.…”
Section: Kolmogorov–Sinai Entropy (mentioning)
confidence: 99%
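The excerpt omits the equation itself; a common estimator in this spirit is the Grassberger-Procaccia K2 entropy, which approximates the KSE from ratios of correlation sums. The sketch below is our own illustration under that assumption and is not necessarily the exact formula used in the cited work.

```python
import numpy as np

def correlation_sum(x, m, r, tau=1):
    """C_m(r): fraction of pairs of m-dimensional delay vectors closer than r (max norm)."""
    n = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
    close, total = 0, 0
    for i in range(n - 1):
        d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
        close += int(np.sum(d < r))
        total += len(d)
    return close / total

def k2_entropy(x, m, r, tau=1):
    """Grassberger-Procaccia style estimate: K2 ~ log(C_m(r) / C_{m+1}(r))."""
    return float(np.log(correlation_sum(x, m, r, tau) / correlation_sum(x, m + 1, r, tau)))

# Chaotic logistic-map series versus a constant (perfectly predictable) signal.
x = np.empty(2000); x[0] = 0.1234
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
print("chaotic series :", k2_entropy(x, m=2, r=0.1))
print("constant series:", k2_entropy(np.full(2000, 0.5), m=2, r=0.1))   # 0.0
```

The noise sensitivity mentioned in the excerpt shows up here as well: additive noise inflates the small-scale correlation sums and biases the estimate upward.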