2017 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2017.8006779
On empirical cumulant generating functions of code lengths for individual sequences

Abstract: We consider the problem of lossless compression of individual sequences using finite-state (FS) machines, from the perspective of the best achievable empirical cumulant generating function (CGF) of the code length, i.e., the normalized logarithm of the empirical average of the exponentiated code length. Since the probabilistic CGF is minimized in terms of the Rényi entropy of the source, one of the motivations of this study is to derive an individual-sequence analogue of the Rényi entropy, in the same way that…
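The quantity defined in the abstract can be sketched numerically. A minimal illustration, assuming an invented list of per-block code lengths (the values below are not from the paper): the empirical CGF is the normalized logarithm of the empirical average of the exponentiated code lengths, and as the parameter tends to zero it reduces to the ordinary average length.

```python
import math

def empirical_cgf(lengths, lam):
    """Empirical CGF of the code lengths at parameter lam:
    (1/lam) * log( mean( exp(lam * l) ) ).
    By Jensen's inequality this is >= the plain average length for lam > 0,
    and it tends to the plain average as lam -> 0."""
    n = len(lengths)
    return math.log(sum(math.exp(lam * l) for l in lengths) / n) / lam

# Toy code lengths (in bits) assigned to successive blocks of a sequence
lengths = [3, 3, 4, 2, 5, 3]
print(empirical_cgf(lengths, 0.5))   # exponential weighting favors long lengths
print(sum(lengths) / len(lengths))   # ordinary average, the lam -> 0 limit
```

Larger values of the parameter penalize long codewords more heavily, which is why the CGF criterion is stricter than the usual expected-length criterion.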

Cited by 1 publication (1 citation statement)
References 11 publications
“…These bounds, expressed in terms of the Rényi entropy, imply that for sufficiently long source sequences, it is possible to make the normalized cumulant generating function of the codeword lengths approach the Rényi entropy as closely as desired by a proper fixed-to-variable uniquely-decodable source code; moreover, a converse result in [13] shows that there is no uniquely-decodable source code for which the normalized cumulant generating function of its codeword lengths lies below the Rényi entropy. In addition, this type of bounds was studied in the context of various coding problems, including guessing (see, e.g., [1], [2], [3], [7], [8], [9], [16], [17], [22], [28], [33], [34], [35], [46], [50]).…”
Section: A. Prior Work (citation type: mentioning; confidence: 99%)
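The bound described in the cited passage (Campbell's theorem) can be checked numerically. A minimal sketch, with an arbitrary toy distribution and parameter chosen only for illustration: integer codeword lengths drawn from the escort distribution p^α / Z, with α = 1/(1+t), satisfy the Kraft inequality and keep the normalized CGF within one bit of the Rényi entropy of order α.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def campbell_lengths(p, t):
    """Near-optimal integer codeword lengths for the CGF criterion:
    l(x) = ceil(-log2 q(x)), where q is the escort distribution
    p^alpha / Z with alpha = 1/(1+t). These lengths satisfy Kraft."""
    alpha = 1 / (1 + t)
    z = sum(pi ** alpha for pi in p)
    return [math.ceil(-math.log2(pi ** alpha / z)) for pi in p]

def cgf(p, lengths, t):
    """Normalized CGF of the codeword lengths:
    (1/t) * log2( sum_x p(x) * 2^(t * l(x)) )."""
    return math.log2(sum(pi * 2 ** (t * l) for pi, l in zip(p, lengths))) / t

p = [0.5, 0.25, 0.125, 0.125]   # toy memoryless source
t = 1.0
alpha = 1 / (1 + t)
lens = campbell_lengths(p, t)
print(lens)
print(cgf(p, lens, t), renyi_entropy(p, alpha))
```

For this example the CGF lands between H_α and H_α + 1, matching the statement that the normalized CGF can approach the Rényi entropy but cannot lie below it for any uniquely-decodable code.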