1998
DOI: 10.1109/18.705557
Joint source-channel coding and guessing with application to sequential decoding

Abstract: We extend our earlier work on guessing subject to distortion to the joint source-channel coding context. We consider a system in which a source is connected to a destination via a channel, and the goal is to reconstruct the source output at the destination within a prescribed distortion level with respect to (w.r.t.) some distortion measure. The decoder is a guessing decoder in the sense that it is allowed to generate successive estimates of the source output until the distortion criterion is met. The pro…

Cited by 42 publications (42 citation statements)
References 15 publications
“…These bounds, expressed in terms of the Rényi entropy, imply that for sufficiently long source sequences, it is possible to make the normalized cumulant generating function of the codeword lengths approach the Rényi entropy as closely as desired by a proper fixed-to-variable uniquely-decodable source code; moreover, a converse result in [13] shows that there is no uniquely-decodable source code for which the normalized cumulant generating function of its codeword lengths lies below the Rényi entropy. In addition, this type of bound has been studied in the context of various coding problems, including guessing (see, e.g., [1], [2], [3], [7], [8], [9], [16], [17], [22], [28], [33], [34], [35], [46], [50]). [26] studied the behavior of the best achievable rate and other fundamental limits in variable-rate lossless source compression without prefix constraints.…”
Section: A Prior Work
confidence: 99%
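The bound the quoted passage describes can be checked numerically. The sketch below (the distribution and the cumulant parameter are arbitrary illustrative choices, not taken from the cited works) uses the idealized, non-integer codeword lengths l_i = -log2(p_i^α / Σ_j p_j^α) with α = 1/(1+t); for these lengths the normalized cumulant generating function (1/t) log2 Σ_i p_i 2^{t·l_i} coincides exactly with the Rényi entropy H_α(X), which is the quantity no uniquely-decodable code can go below.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits), alpha != 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def idealized_lengths(p, t):
    """Non-integer codeword lengths minimizing the normalized
    cumulant generating function for parameter t > 0."""
    alpha = 1 / (1 + t)
    s = sum(pi ** alpha for pi in p)
    return [-math.log2(pi ** alpha / s) for pi in p]

def normalized_cgf(p, lengths, t):
    """(1/t) * log2 E[2^(t*L)]: the normalized cumulant generating
    function of the codeword lengths L."""
    return math.log2(sum(pi * 2 ** (t * li) for pi, li in zip(p, lengths))) / t

p = [0.5, 0.25, 0.125, 0.125]  # illustrative source distribution
t = 1.0                        # cumulant parameter; alpha = 1/(1+t) = 0.5
lengths = idealized_lengths(p, t)

# With the idealized lengths, the normalized CGF equals H_{1/2}(X).
print(normalized_cgf(p, lengths, t))
print(renyi_entropy(p, 1 / (1 + t)))
```

Rounding the idealized lengths up to integers (as a real code must) adds at most one bit per symbol, which is why the bound is approached, rather than met, for finite block lengths.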
“…A. Guessing 1) Background: The problem of guessing discrete random variables has various theoretical and operational aspects in information theory (see [1], [2], [3], [10], [11], [14], [17], [31], [32], [41], [54], [55], [56], [59], [65], [68], [74], [75], [85]). The central object of interest is the distribution of the number of guesses required to identify a realization of a random variable X, taking values in a finite or countably infinite set X = {1, .…”
Section: Information-theoretic Applications: Non-asymptotic Bounds
confidence: 99%
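For the guessing problem the quoted passage refers to, the strategy minimizing the expected number of guesses is the natural one: guess candidate values in order of decreasing probability. A minimal sketch on a toy alphabet (the pmf is an arbitrary illustration, not from the cited works), with an exhaustive check that no other ordering does better:

```python
import itertools

def expected_guesses(p, order):
    """E[G] when values are guessed in the given order: the value
    at position k (1-indexed) requires k guesses if it occurred."""
    return sum(k * p[i] for k, i in enumerate(order, start=1))

p = [0.05, 0.6, 0.1, 0.25]  # illustrative pmf over X = {0, 1, 2, 3}

# Optimal guessing order: decreasing probability (ties broken arbitrarily).
optimal = sorted(range(len(p)), key=lambda i: -p[i])
best = expected_guesses(p, optimal)

# Exhaustive check on this tiny alphabet: the decreasing-probability
# order minimizes E[G] over all orderings.
assert all(expected_guesses(p, o) >= best - 1e-12
           for o in itertools.permutations(range(len(p))))
print(best)  # 1*0.6 + 2*0.25 + 3*0.1 + 4*0.05 = 1.6
```

The literature cited above is largely concerned with higher moments E[G^ρ] and their exponential growth rate for product sources, which Arikan tied to the Rényi entropy of order 1/(1+ρ); the same decreasing-probability order remains optimal for every moment.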
“…In other words, conditions (a) and (b) apply with λ(Q) = min{H(Q), R}. It should be pointed out that in [40], as well as in other related works on various settings of the guessing problem [1], [2], [3], [45], the technique proposed in Observation 1 had in fact already been used (at least implicitly) to address all these problems.…”
Section: Universal Asymptotically Optimum Strategies; The Optimum Str…
confidence: 99%