Information distance
1998 · DOI: 10.1109/18.681318

Abstract: While Kolmogorov complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term.…
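The abstract truncates before the formal definition; the paper's central result (stated here from the published work) is that the universal information distance is, up to an additive logarithmic term, the max distance

```latex
E(x, y) = \max\{\, K(x \mid y),\; K(y \mid x) \,\}
```

where K(x | y) denotes the conditional Kolmogorov complexity of x given y, i.e. the length of a shortest program that computes x from input y.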


Cited by 401 publications (391 citation statements) · References 25 publications
“…It is also quite amusing to see a quantity appearing that is known as information-distance in other contexts (see e.g. [5]). …”
Section: Example 8, Consider the Class of States
confidence: 99%
“…Our basic tool, and a recent development in the field of information theory, is the universal information distance (27,28), which can be applied to any two objects stored on a computer (e.g., networks, genome sequences, or in our case, macrophage system states). This distance uniquely specifies the informational difference between two objects and is defined in terms of the Kolmogorov complexity.…”
Section: Quantifying Information Processing and Flow
confidence: 99%
“…These universal codes are optimal for compressing ergodic sources and are still sufficiently computable for use in practice. The information distance and information metric introduced in [1,11] express how similar two objects are. Complementary to independence tests, similar objects have low distance or metric value.…”
Section: Introduction
confidence: 99%
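The citation above notes that real compressors make the information distance usable in practice. A common way to do this (from later work by Li and Vitányi, not from this page) is the normalized compression distance, which replaces the uncomputable Kolmogorov complexity with the length of a compressor's output. A minimal sketch using Python's standard `zlib`:

```python
import zlib

def c(data: bytes) -> int:
    """Compressed length of data: a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte strings.

    Approximates the normalized information distance
    max{K(x|y), K(y|x)} / max{K(x), K(y)} by substituting compressed
    lengths: NCD(x, y) = (C(xy) - min{C(x), C(y)}) / max{C(x), C(y)}.
    """
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"lorem ipsum dolor sit amet, consectetur adipiscing " * 20
print(ncd(a, a))  # near 0: an object shares all of its information with itself
print(ncd(a, b))  # near 1: the two texts share little information
```

The quality of the approximation depends on how well the compressor models the data; `zlib` works for short texts, while domain-specific compressors give tighter estimates.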
“…1 One can think of a sum-test as a test for randomness for the case of a semimeasure on a discrete domain. Namely, if d is a P -sum-test, then for every n it easily follows from (1) that the set {x : d(x) ≥ n} has weight ≤ 2 −n under the semimeasure P .…”
Section: Introduction
confidence: 99%