1974
DOI: 10.1109/tit.1974.1055204

Error exponent for source coding with a fidelity criterion

Cited by 153 publications (183 citation statements). References 3 publications.
“…It immediately follows from Lemma 2 that (12) Taking expectation on both sides of (11) with respect to the random choice of the codebooks we have (13) where and is the set of all possible codes for type . For , each code is generated according to the distribution (14) We recall that each subcode is generated according to distribution , where and .…”
Section: Code Construction (mentioning)
confidence: 99%
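
The quoted step averages a per-codebook bound over the random choice of codebooks. As a loose, hedged illustration of that kind of averaging (not the cited per-type subcode construction; the source and codeword distributions, rate, and distortion level below are assumptions of mine), a minimal Monte Carlo sketch in Python:

```python
import numpy as np

# Loose illustration only: i.i.d. random codebook for a Bernoulli source
# under normalized Hamming distortion. The cited construction draws
# separate subcodes per type; all parameters below are illustrative.
rng = np.random.default_rng(0)

n, R, D = 24, 0.5, 0.2        # blocklength, rate (bits/symbol), distortion level
p, q = 0.5, 0.5               # source bias and codeword-generating bias
M = int(round(2 ** (n * R)))  # codebook size 2^{nR}

trials, misses = 200, 0
for _ in range(trials):
    x = rng.random(n) < p                  # source block
    codebook = rng.random((M, n)) < q      # fresh random codebook each trial
    dist = np.mean(codebook != x, axis=1)  # normalized Hamming distortion to each codeword
    misses += not np.any(dist <= D)        # event: no codeword within distortion D

# Averaging over the random codebook choice, as in the quoted step,
# turns the per-codebook bound into a bound on this failure probability.
print(f"estimated P(no codeword within D) ≈ {misses / trials:.3f}")
```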
“…The distortion for a fixed code with rate parameters and is given by (32) where in (32) we used the bound of (31), the definition of the set in (30), and the bound (12). Now taking the expectation of with respect to the random choice of , we have…”
Section: Define (30) (mentioning)
confidence: 99%
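
The quoted bound is an instance of the usual random-coding decomposition of expected distortion. In generic form (my notation, assuming a distortion measure bounded by d_max; a sketch of the standard step, not the cited paper's exact inequality):

\[
\mathbb{E}_{\mathcal C}\,\mathbb{E}\big[d(X^n,\hat X^n)\big]
\;\le\; D \;+\; d_{\max}\,\Pr\Big\{\min_{\hat x^n\in\mathcal C} d(X^n,\hat x^n) > D\Big\},
\]

so the expected distortion exceeds the target D only through the probability that the random codebook fails to cover the source block within distortion D.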
“…Accordingly, the rate-function R(D; P; M) reduces to Shannon's rate-distortion function R(D; P), and the theorem yields Marton's error-exponent result [7]. Let D ≥ 0 be a given distortion level, and R(D; P) < R < log |A|. Among all sequences of codebooks {C_n} with asymptotic rate no greater than R bits/symbol, lim sup_{n→∞} (1/n) log |C_n| ≤ R, the fastest achievable asymptotic rate of decay of the probability of error is…”
Section: Example 2: Lossy Data Compression (mentioning)
confidence: 99%
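
The exponent elided at the end of the quote is, in the standard statement of Marton's result (notation mine; whether the constraint below is strict or non-strict is a known technical subtlety I am not resolving here):

\[
F(R, D, P) \;=\; \inf_{Q:\; R(D;Q) \,>\, R} D(Q \,\|\, P),
\]

where D(Q‖P) is the Kullback-Leibler divergence and R(D;Q) is the rate-distortion function of a memoryless source with marginal Q; the probability that the code's distortion exceeds D then decays roughly as 2^{-n F(R,D,P)}.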
“…where H_α is the Rényi entropy [2] given by (14). Therefore, without loss in asymptotic performance, the optimal length function can be assumed to depend on x only through Q_x. Let T_x be the type of x, namely,…”
Section: S ∈ S, X ∈ X (mentioning)
confidence: 99%
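
For reference, the two standard ingredients behind the quoted reduction (textbook definitions, not text recovered from the citing paper): the Rényi entropy of order α of a distribution P on a finite alphabet X is

\[
H_\alpha(P) \;=\; \frac{1}{1-\alpha}\,\log \sum_{x\in\mathcal X} P(x)^{\alpha},
\qquad \alpha>0,\ \alpha\neq 1,
\]

and, writing Q_x for the empirical distribution of a length-n sequence x and T_x for its type class, the method-of-types bounds

\[
(n+1)^{-|\mathcal X|}\, 2^{\,n H(Q_x)} \;\le\; |T_x| \;\le\; 2^{\,n H(Q_x)}
\]

are why restricting to length functions that depend on x only through Q_x costs nothing asymptotically.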