2009
DOI: 10.1016/j.physa.2009.02.003

Generalized statistics framework for rate distortion theory

Abstract: Variational principles for rate distortion (RD) theory in lossy compression are formulated within the ambit of the generalized nonextensive statistics of Tsallis, for values of the nonextensivity parameter satisfying 0 < q < 1 and q > 1. Alternating minimization numerical schemes to evaluate the nonextensive RD function are derived. Numerical simulations demonstrate the efficacy of the generalized statistics RD models.
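The alternating minimization schemes mentioned in the abstract generalize the classical Blahut-Arimoto algorithm for computing the rate distortion function, to which they reduce in the q → 1 limit. Below is a minimal sketch of the classical scheme only; the function name `blahut_arimoto_rd` and the parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def blahut_arimoto_rd(p_x, dist, beta, n_iter=200, tol=1e-10):
    """Classical Blahut-Arimoto alternating minimization for the
    rate-distortion function R(D). The nonextensive variant studied
    in the paper replaces the exponential kernel with a q-deformed one."""
    n_x, n_xhat = dist.shape
    q_xhat = np.full(n_xhat, 1.0 / n_xhat)   # reproduction marginal, uniform start
    for _ in range(n_iter):
        # Step 1: minimize over the test channel Q(xhat|x) with q_xhat fixed.
        log_Q = np.log(q_xhat)[None, :] - beta * dist
        log_Q -= log_Q.max(axis=1, keepdims=True)        # numerical stabilization
        Q = np.exp(log_Q)
        Q /= Q.sum(axis=1, keepdims=True)
        # Step 2: minimize over the reproduction marginal with the channel fixed.
        q_new = p_x @ Q
        if np.abs(q_new - q_xhat).max() < tol:
            q_xhat = q_new
            break
        q_xhat = q_new
    # Distortion and rate (in nats) at the converged point.
    D = float(np.sum(p_x[:, None] * Q * dist))
    ratio = np.where(Q > 0, Q / q_xhat[None, :], 1.0)
    R = float(np.sum(p_x[:, None] * Q * np.log(ratio)))
    return R, D

# Symmetric binary source with Hamming distortion, where the exact
# rate-distortion curve is R(D) = ln 2 - H(D) for D < 1/2.
p_x = np.array([0.5, 0.5])
dist = 1.0 - np.eye(2)
R, D = blahut_arimoto_rd(p_x, dist, beta=2.0)
```

Each value of the slope parameter β traces out one point (R, D) on the rate-distortion curve; sweeping β recovers the whole curve, which is how such schemes are typically used to produce the plots in RD papers.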

Cited by 11 publications (42 citation statements)
References 49 publications (125 reference statements)
“…This suggests possible connections between coding theory and the measure of complexity in nonextensive statistical mechanics. Related works are the study of generalized channel capacities [9], the notion of nonadditive information content [10], the presentation of a generalized rate distortion theory [11]. The first section is devoted to a very short presentation of the source coding context, and to the presentation of the fundamental Shannon source coding theorem.…”
Section: Introduction
confidence: 99%
“…Instead, as established in this paper, the dual generalized K-Ld is a scaled Bregman divergence. Future work uses the results derived herein to analyze: (i) the generalized statistics rate distortion theory [2], (ii) the generalized statistics information bottleneck method [3] within the context of scaled Bregman divergences and scaled Bregman informations, and (iii) deformed statistics extensions of the minimum Bregman information principle and their applications in machine learning [36].…”
Section: Summary and Discussion
confidence: 99%
“…• (i) the generalized K-Ld defined by (6) subjected to the additive duality (dual generalized K-Ld (8) and (15)) is shown to be consistent with the canonical probability that maximizes the dual Tsallis entropy of the form [2,26]:…”
Section: Goal of This Paper
confidence: 98%
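The “dual Tsallis entropy” in the quoted passage refers to the additive duality q* = 2 − q of nonextensive statistics. The paper's own equations (6), (8), and (15) are not reproduced in this excerpt; as standard background only, the Tsallis entropy and the q-exponential form of its constrained maximizer are:

```latex
S_q(p) = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
p_i \propto \exp_q(-\beta E_i),
\qquad
\exp_q(x) = \bigl[1 + (1-q)\,x\bigr]_+^{1/(1-q)},
```

with the ordinary Boltzmann-Gibbs exponential recovered in the limit q → 1.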
“…The qualitative enhancement of the generalized IB method vis-à-vis an equivalent B-G-S model are demonstrated by the relevance-compression curves. A generalized Bregman RD (GBRD) model is presented [2]. A Tsallis-Bregman lower bound for the RD function is derived.…”
confidence: 99%