2016 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2016.7541398
On the smooth Rényi entropy and variable-length source coding allowing errors

Abstract: In this paper, we consider the problem of variable-length source coding allowing errors. The exponential moment of the codeword length is analyzed in both the non-asymptotic and the asymptotic regime. Our results show that the smooth Rényi entropy characterizes the optimal exponential moment of the codeword length. Index Terms: source coding, exponential moment, smooth Rényi entropy, variable-length source coding. Does the smooth Rényi entropy H_α^ε of order α have operational meaning? In this study, we…
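As a concrete illustration (not from the paper itself), the ε-smooth Rényi entropy of order 0, the smooth max-entropy in the Renner–Wolf sense, is easy to compute directly: it is the log of the size of the smallest set of outcomes capturing probability at least 1 − ε. The sketch below shows this special case of the smooth Rényi entropy family; the function name is our own choice.

```python
import math

def smooth_max_entropy(pmf, eps):
    """Epsilon-smooth Renyi entropy of order 0 (smooth max-entropy), in bits.

    H_0^eps(P) = log2 of the size of the smallest set of outcomes whose
    total probability is at least 1 - eps.  A coarse but illustrative
    member of the smooth Renyi entropy family discussed in the abstract.
    """
    probs = sorted(pmf, reverse=True)  # greedily keep the most likely outcomes
    cumulative, count = 0.0, 0
    for p in probs:
        cumulative += p
        count += 1
        if cumulative >= 1.0 - eps:
            break
    return math.log2(count)
```

For the dyadic source P = (1/2, 1/4, 1/8, 1/8), allowing error ε = 0 requires all four outcomes (2 bits), while ε = 0.5 lets the encoder keep only the most likely outcome (0 bits), matching the intuition that smoothing discards an ε-tail of the source.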

Cited by 8 publications (17 citation statements). References 20 publications.
“…The motivation for this work is rooted in the diverse information-theoretic applications of Rényi measures [62]. These include (but are not limited to) asymptotically tight bounds on guessing moments [1], information-theoretic applications such as guessing subject to distortion [2], joint source-channel coding and guessing with application to sequential decoding [3], guessing with prior access to a malicious oracle [14], guessing while allowing the guesser to give up and declare an error [50], guessing in secrecy problems [56], [75], guessing with limited memory [64], and guessing under source uncertainty [74]; encoding tasks [12], [13]; Bayesian hypothesis testing [8], [67], [79], and composite hypothesis testing [71], [77]; Rényi generalizations of the rejection sampling problem in [35], motivated by the communication complexity in distributed channel simulation, where these generalizations distinguish between causal and non-causal sampler scenarios [52]; Wyner's common information in distributed source simulation under Rényi divergence measures [87]; various other source coding theorems [15], [23], [24], [36], [49], [50], [68], [76], [78], [79], channel coding theorems [4], [5], [26], [60], [66], [78], [79], [86], including coding theorems in quantum information theory…”
Section: Introduction (mentioning, confidence: 99%)
“…Inequality (43) leads to the application of Theorem 1 with ρ = 2 (see (46)). In the derivation of Theorem 2, we refer to v(α) := c_α^(∞)(2) (see (47)–(49)) rather than to c_α^(n)(2) (although, from (24), we have 0 ≤ c_α^(n)(2) ≤ v(α) for all α > 0). We do so since, for n ≥ 16, the difference between the curve of c_α^(n)(2) (as a function of α > 0) and the curve of c_α^(∞)(2) is marginal (see the dashed and solid lines in the left plot of Figure 2), and also because the function v in (33) is expressed in closed form whereas c_α^(n)(2) is…”
(mentioning, confidence: 99%)
“…Hence, it is not hard to modify the theorem for the case where only deterministic encoder mappings are allowed. We omit the details, but see Proposition 1 of [27] for the case of |Y| = 1.…”
Section: Source Coding (mentioning, confidence: 99%)
“…The fundamental limit is R*(n, D, ǫ, t) := inf{R : ∃ an (n, D, R, ǫ, t) code} (9). III. PREVIOUS STUDY: Courtade and Verdú [3] considered the same problem setting with the restriction that the code (f, g) satisfies P[d(X, g(f(X))) > D] = 0 (i.e., ǫ = 0 in (4)).…”
Section: Problem Formulation (mentioning, confidence: 99%)
“…Several previous works investigated the fundamental limit of the normalized cumulant generating function of codeword lengths: e.g., [1] and [2] for the problem of variable-length lossless source coding; [9] for the problem of variable-length source coding allowing errors; [3] for the problem of variable-length lossy source coding.…”
Section: Introduction (mentioning, confidence: 99%)
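The "cumulant generating function of codeword lengths" in the excerpt above is, for a code with length function ℓ, the quantity (1/t) log₂ E[2^{t·ℓ(X)}], i.e., the exponential moment studied in the paper. A minimal sketch of this computation (function name and example values are our own, for illustration only):

```python
import math

def normalized_cgf(pmf, lengths, t):
    """Normalized cumulant generating function (1/t) * log2 E[2^(t * l(X))].

    As t -> 0 this tends to the expected length E[l(X)]; larger t
    penalizes long codewords more heavily, which is why the exponential
    moment is a stricter criterion than average length alone.
    """
    if t == 0:
        return sum(p * l for p, l in zip(pmf, lengths))
    moment = sum(p * 2.0 ** (t * l) for p, l in zip(pmf, lengths))
    return math.log2(moment) / t
```

By Jensen's inequality the normalized CGF is always at least the expected codeword length, with equality when all codewords share one length, so the exponential-moment criterion only bites when lengths vary.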