2007
DOI: 10.1049/iet-com:20070076
On information theory parameters of infinite anti-uniform sources

Cited by 5 publications (6 citation statements) · References 11 publications
“…Compared to the Huffman algorithm, these methods are very complex. A source with n symbols whose Huffman code has codelength vector L_n = (1, 2, 3, …, n−2, n−1, n−1) is called an anti-uniform source [15,16]. Such sources have been shown to correspond to particular probability distributions.…”
Section: Introduction (mentioning, confidence: 99%)
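The anti-uniform codelength pattern quoted above can be checked directly by running the Huffman construction on a candidate source. A minimal sketch, assuming a truncated geometric source with success probability 1/2 (the tail mass folded into the last symbol); `huffman_codelengths` is our own helper, not from the cited papers:

```python
import heapq

def huffman_codelengths(probs):
    """Return the sorted codeword lengths of a binary Huffman code
    for the given probability vector (lengths are all the AUH test needs)."""
    # Heap entries: (probability, unique id, indices of the leaves below).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    depth = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for leaf in s1 + s2:  # each merge adds one bit to every leaf below it
            depth[leaf] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return sorted(depth)

# Truncated geometric source, success probability 1/2, n = 6 symbols;
# the tail mass beyond the last symbol is folded into it, so probs sums to 1.
n = 6
probs = [2.0 ** -k for k in range( 1, n)] + [2.0 ** -(n - 1)]
print(huffman_codelengths(probs))  # → [1, 2, 3, 4, 5, 5], i.e. L_6
```

The resulting length multiset (1, 2, 3, 4, 5, 5) is exactly the anti-uniform vector L_6, consistent with the dyadic source being AUH.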
“…For example, it was shown in [17] and [18], respectively, that the normalized tail of the Poisson distribution and the geometric distribution with success probability greater than some critical value are anti-uniform sources. It was demonstrated in [15,16] that a source with probability vector…”
Section: Introduction (mentioning, confidence: 99%)
“…AUH sources can be generated by several probability distributions. It has been shown that the geometric, quasi-geometric, Fibonacci, exponential, Poisson and negative binomial distributions lie in the class of AUH sources for some regimes of their parameters [7], [18], [19], [20]. A related topic was addressed in [21], where the authors studied weakly super-increasing (WSI) and partial WSI sources in connection with Fibonacci numbers and the golden mean, which appear extensively in modern science and, in particular, have applications in coding and information theory.…”
Section: Introduction (mentioning, confidence: 99%)
“…Tight lower and upper bounds on the average codeword length, entropy and redundancy of finite and infinite AUH codes are derived in terms of alphabet size. Related topics are addressed in [13]–[15]. The problem of M-ary Huffman codes is analyzed in [16], where it is shown that for AUH codes, by a proper choice of the source probabilities, the average codeword length can be made close to unity.…”
Section: Introduction (mentioning, confidence: 99%)
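The quoted claim that a suitably skewed source drives the average codeword length toward unity can be checked numerically. A minimal sketch under stated assumptions: we take a truncated geometric source with success probability p = 0.9 (above the critical value mentioned in the excerpts above, so we assume it is AUH) and apply the AUH codelength vector L_n = (1, 2, …, n−1, n−1) directly; the specific n and p are illustrative choices, not from the cited papers:

```python
from math import log2

# Truncated geometric source, success probability p = 0.9, n = 6 symbols,
# with the tail mass folded into the last symbol so the vector sums to 1.
# Assumption: p is above the critical value, so the source is AUH and its
# Huffman codelengths are the anti-uniform vector (1, 2, ..., n-1, n-1).
n, p = 6, 0.9
probs = [p * (1 - p) ** k for k in range(n - 1)] + [(1 - p) ** (n - 1)]
lengths = list(range(1, n)) + [n - 1]

avg_len = sum(pi * li for pi, li in zip(probs, lengths))  # expected code length
entropy = -sum(pi * log2(pi) for pi in probs)             # source entropy (bits)
print(f"average length = {avg_len:.4f}, entropy = {entropy:.4f}, "
      f"redundancy = {avg_len - entropy:.4f}")
```

Here the average length comes out just above 1 bit per symbol, illustrating how a heavily skewed AUH source pushes the average codeword length toward unity while the entropy (and hence the redundancy gap) behaves as the bounds in the paper describe.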