2017
DOI: 10.3390/e19060260
Information Distances versus Entropy Metric

Abstract: Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures differ from the entropy metric: the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distribution, up to a constant, the expected value of Kolmogorov complexity equals the Shannon entropy. We study the similar relationship between entropy and…
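To make the relationship the abstract refers to concrete, here is a sketch of the standard bound from algorithmic information theory (notation assumed: K is prefix Kolmogorov complexity, H is Shannon entropy, and the constant c_P depends only on the computable distribution P, not on x):

% For a computable probability distribution P over strings x, the
% expected prefix Kolmogorov complexity equals the Shannon entropy
% up to an additive constant depending only on P.
\[
  0 \;\le\; \sum_{x} P(x)\,K(x) \;-\; H(P) \;\le\; c_P ,
  \qquad
  H(P) = -\sum_{x} P(x)\,\log_2 P(x).
\]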

Cited by 7 publications (4 citation statements)
References 33 publications
“…The Kolmogorov complexity distance (KD) has become an important tool in a wide variety of applications [ 25 ]. It has also been applied in hydrology in scaling problems, since the heterogeneity of catchments and the variability of hydrological processes make scaling (which is performed either in a deterministic or a stochastic framework) so difficult [ 26 ].…”
Section: Methods
confidence: 99%
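For context, the normalized information distance underlying such applications is typically defined as follows (the standard definition from the information-distance literature, supplied here as a contextual sketch rather than quoted from the citing paper):

% Normalized information distance (NID) between strings x and y,
% where K(.) is prefix Kolmogorov complexity and K(x|y) is the
% conditional complexity; values lie in [0, 1] up to log terms.
\[
  \mathrm{NID}(x, y) =
    \frac{\max\{K(x \mid y),\, K(y \mid x)\}}
         {\max\{K(x),\, K(y)\}}.
\]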
“…Entropy can reflect the state of a system, and information entropy (IE) can quantify the amount of noise in the reconstructed signal. The higher the entropy value of a signal, the sparser and more uncertain it is, indicating more noise in the signal and a lower signal-to-noise ratio (SNR) (and vice versa) [27]. In this paper, the IE of the reconstructed signal is used as part of the fitness evaluation function.…”
Section: Fitness Evaluation Function Based on Information Entropy
confidence: 99%
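A minimal sketch of such an information-entropy fitness term, assuming the common histogram estimate of the signal's amplitude distribution (the binning scheme is an illustrative assumption, not taken from the citing paper):

% Information entropy of a reconstructed signal s: discretize the
% amplitudes into m bins, estimate probabilities p_i from the
% normalized histogram counts n_i, then compute Shannon entropy.
\[
  \mathrm{IE}(s) = -\sum_{i=1}^{m} p_i \log_2 p_i ,
  \qquad
  p_i = \frac{n_i}{\sum_{j=1}^{m} n_j},
\]
% A noisier reconstruction spreads probability mass across more
% bins, which raises IE and signals a lower SNR.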
“…The number of feature subspaces (…). The mutual information entropy (MIE) [16,17] between samples a_i…”
Section: B. Fluctuating Features Division Based on the Proposed Entrop…
confidence: 99%
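The fragment above is truncated in extraction; the quantity it appears to invoke is presumably the standard mutual information between two variables (an assumption based on the wording, not recovered from the source):

% Mutual information between variables X and Y, expressed through
% marginal and joint Shannon entropies, or equivalently through
% the joint and marginal distributions.
\[
  I(X; Y) = H(X) + H(Y) - H(X, Y)
          = \sum_{x,\,y} p(x, y)\,\log_2 \frac{p(x, y)}{p(x)\,p(y)}.
\]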