2011
DOI: 10.3390/e13030595

Entropy Measures vs. Kolmogorov Complexity

Abstract: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it holds only for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t …
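For orientation, here is a minimal sketch of the quantities the abstract invokes; the form of the additive constant (via K(P), the complexity of a program computing P) and the Rényi/Tsallis definitions follow the standard textbook formulation (Li and Vitányi) and are not quoted from the paper itself:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Shannon entropy of a recursive probability distribution P over strings x:
\[ H(P) = -\sum_x P(x)\log P(x) \]
% Expected Kolmogorov complexity equals H(P) up to an additive constant
% depending only on P (here written via K(P)):
\[ 0 \;\le\; \sum_x P(x)\,K(x) - H(P) \;\le\; K(P) + O(1) \]
% R\'enyi and Tsallis entropies of order \alpha; both reduce to Shannon
% entropy in the limit \alpha -> 1, the only order for which the paper
% shows the relationship above survives:
\[ H_\alpha(P) = \tfrac{1}{1-\alpha}\log \sum_x P(x)^\alpha, \qquad
   S_\alpha(P) = \tfrac{1}{\alpha-1}\Bigl(1 - \sum_x P(x)^\alpha\Bigr) \]
\end{document}
```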


Cited by 40 publications (22 citation statements)

References 12 publications
“…By using unsupervised learning, the authors showed that anomalies can be successfully clustered. Wagner and Plattner [7] made use of Kolmogorov complexity (related to Shannon entropy) [68,69] to detect worms in network traffic. Their work focuses mostly on implementation aspects and scalability and does not propose any specific analysis techniques.…”
Section: Detection Via Feature Distributions
confidence: 99%
“…Adonai Sant'Anna [23] directly identifies the algorithmic complexity of a thermal system with its entropy. Andreia Teixeira et al. [24] consider Kolmogorov complexity and Shannon entropy to be conceptually different measures, but convincingly prove that, for any recursive probability distribution, the expected value of complexity equals the entropy up to a constant. Peter Grunwald and Paul Vitanyi [20] point out that K(x) is not a recursive function: Kolmogorov complexity is in general not calculable, in the sense that there is no computer program that could compute K in Equation (3) for an arbitrary description x, although in simple and special cases it may be calculated, and in more complex cases it can be semi-calculated (i.e., approximately estimated).…”
Section: Algorithmic Complexity and Its Similarity To Entropy
confidence: 99%
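The "semi-calculation" point is easy to make concrete: any lossless compressor yields a computable upper bound on K(x), which can only over-estimate the true value. A minimal sketch (zlib chosen arbitrarily; this illustrates the general technique, not the method of any of the cited papers):

```python
import os
import zlib

def k_upper_bound_bits(data: bytes, level: int = 9) -> int:
    """Computable upper bound on the Kolmogorov complexity K(data):
    the bit-length of the zlib-compressed string. K itself is not
    computable; a real compressor only ever over-estimates it, which
    is the 'semi-calculation from above' mentioned in the quote."""
    return 8 * len(zlib.compress(data, level))

regular = b"ab" * 500      # highly regular: compresses well
noise = os.urandom(1000)   # incompressible with high probability

print(k_upper_bound_bits(regular))  # far below the raw 8000 bits
print(k_upper_bound_bits(noise))    # near (or slightly above) 8000 bits
```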
“…Despite this nice theoretical property, the practical use of this measure is limited not only by its incomputability, but also by the imprecision of the compressor when the measure is used in its more feasible form. By contrast, its Shannon entropy version [50], [52], [61] gives better results. The latter, which deals with PDFs instead of strings of symbols, is closely related to the concept of MI.…”
Section: Perception-based Similarity Metric
confidence: 97%
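The contrast the quote draws can also be sketched: instead of compressing a symbol string, the entropy version estimates a probability distribution and evaluates H directly. A hypothetical plug-in estimator, not the estimator used in [50], [52], or [61]:

```python
import math
from collections import Counter

def empirical_entropy_bits(symbols: str) -> float:
    """Plug-in Shannon entropy estimate H = -sum p*log2(p) of the
    empirical distribution of `symbols`, i.e. it works on a PMF/PDF
    rather than on the raw string, as in the entropy-based variant."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(empirical_entropy_bits("ababababab"))  # ~1.0 bit per symbol
print(empirical_entropy_bits("aaaaaaaaaa"))  # 0.0 bits
```

Mutual information is then a difference of such entropies, I(X;Y) = H(X) + H(Y) − H(X,Y), which is why the quote ties the entropy version to MI.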