IEEE International Symposium on Information Theory, 2003. Proceedings. 2003
DOI: 10.1109/isit.2003.1228266

Source coding for general penalties

Cited by 5 publications (11 citation statements)
References 4 publications
“…The infinite sequence of optimal codes obtained when k→∞ (q→0) stabilizes in the limit, as stated in the following proposition, which follows from Proposition 1 (the fact is also mentioned in [14, Ch. 5…”
Section: The Family of Parameters q = 2^{-k}
confidence: 61%
“…A matching decoding procedure is easily derived. Encoding and decoding procedures for all the codes in this section are presented in [12], [14]. Kraft functions.…”
confidence: 99%
“…Remark 2: (a) Note that if we formally substitute α = 0 in (3), it reduces to (1). Since inf_M f > 0 (and thus M is bounded), the right-hand side of (4) is finite.…”
Section: Results
confidence: 99%
“…From an operational point of view, the use of Rényi entropy as quantizer rate is supported by Campbell's work [4], which showed that Rényi's entropy plays a role analogous to Shannon entropy in this more general setting. An overview of related results can be found in [1].…”
Section: Introduction
confidence: 99%
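The operational role referred to in this citation is Campbell's coding theorem: for the exponential cost (1/t) log2 Σ p_i 2^{t ℓ_i}, the optimal prefix-code cost lies within one bit of the Rényi entropy of order α = 1/(1+t), mirroring Shannon entropy's role for the ordinary expected length. The sketch below illustrates that bound numerically; the distribution p, the parameter t, and all function names are illustrative choices made here, not taken from the cited papers.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha != 1), in bits."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def campbell_cost(p, lengths, t):
    """Exponential (Campbell) average codeword length with parameter t > 0."""
    p = np.asarray(p, dtype=float)
    lengths = np.asarray(lengths, dtype=float)
    return np.log2(np.sum(p * 2.0 ** (t * lengths))) / t

# Illustrative source distribution and cost parameter (not from the paper).
p = np.array([0.5, 0.25, 0.125, 0.125])
t = 1.0
alpha = 1.0 / (1.0 + t)

# Campbell's construction: choose integer lengths against the "tilted"
# distribution q_i proportional to p_i^alpha; these satisfy the Kraft
# inequality and bring the exponential cost within 1 bit of H_alpha(p).
q = p ** alpha / np.sum(p ** alpha)
lengths = np.ceil(-np.log2(q))
assert np.sum(2.0 ** -lengths) <= 1.0  # Kraft inequality holds

H_alpha = renyi_entropy(p, alpha)
L_t = campbell_cost(p, lengths, t)
print(f"H_alpha = {H_alpha:.4f} bits, Campbell cost = {L_t:.4f} bits")
assert H_alpha <= L_t < H_alpha + 1.0
```

As t→0 (α→1) the exponential cost tends to the ordinary expected length and the bound reduces to the familiar Shannon-entropy bound, which is the sense in which Rényi entropy plays the "analogous role" mentioned in the quotation.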
“…This approach was first suggested in [13] as an alternative to the Lagrangian rate definition considered there, which simultaneously controls codebook size and output (Shannon) entropy. Further motivation for using Rényi entropy as quantization rate can be obtained from axiomatic considerations [27, 1], as well as from the operational role of the Rényi entropy in variable-length lossless coding [8, 17, 2].…”
Section: Introduction
confidence: 99%