2022
DOI: 10.48550/arxiv.2203.15109
Preprint

27 Open Problems in Kolmogorov Complexity

Abstract: Shannon defined the entropy of a random variable $M$ with $k$ values $m_1, \ldots, m_k$ having probabilities $p_1, \ldots, p_k$ as
$$H = \sum_{i=1}^{k} p_i \log \frac{1}{p_i}.$$
This formula can be informally read as follows: the $i$-th message $m_i$ brings us $\log(1/p_i)$ "bits of information" (whatever this means), and appears with frequency $p_i$, so $H$ is the expected amount of information provided by one random message (one sample of the random variable). Moreover, we can construct an optimal uniquely decodable code that requires about $H$ (at most $H + 1$, to be exact) …
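The entropy formula and the bound on optimal code length can be checked numerically. Below is a minimal sketch, not from the paper: it computes $H$ for a small distribution and compares it with the expected codeword length $L$ of a Huffman code, which satisfies $H \le L < H + 1$. The helper names `shannon_entropy` and `huffman_lengths` are illustrative, not from the source.

```python
import heapq
import math

def shannon_entropy(probs):
    """H = sum_i p_i * log2(1/p_i), in bits (terms with p_i = 0 contribute 0)."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix (Huffman) code."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # them moves one level deeper, i.e. gains one code bit.
        p1, ids1 = heapq.heappop(heap)
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]  # a small example distribution
H = shannon_entropy(probs)         # 1.75 bits
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H = {H} bits, expected code length L = {L} bits")
assert H <= L < H + 1
```

For dyadic probabilities (negative powers of two, as above) the expected Huffman length equals $H$ exactly; in general $L$ lies in $[H, H + 1)$, matching the "at most $H + 1$" claim in the abstract.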

Cited by 0 publications
References: 39 publications
