2002
DOI: 10.1109/tit.2002.1003841

Source coding, large deviations, and approximate pattern matching

Abstract: We present a development of parts of rate-distortion theory and pattern-matching algorithms for lossy data compression, centered around a lossy version of the Asymptotic Equipartition Property (AEP). This treatment closely parallels the corresponding development in lossless compression, a point of view that was advanced in an important paper of Wyner and Ziv in 1989. In the lossless case we review how the AEP underlies the analysis of the Lempel-Ziv algorithm by viewing it as a random code and reducing it to …
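
A minimal numerical illustration of the lossless AEP mentioned in the abstract (my own sketch, not taken from the paper): assuming an i.i.d. Bernoulli(p) source, it checks that -(1/n) log2 P(X_1^n) concentrates around the binary entropy H(p), the concentration that typical-set and random-coding arguments rest on.

```python
import numpy as np

def empirical_aep(p=0.3, n=10_000, trials=5, seed=0):
    """Check that -(1/n) log2 P(X_1^n) concentrates near H(p) for an
    i.i.d. Bernoulli(p) source (the lossless AEP)."""
    rng = np.random.default_rng(seed)
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # binary entropy H(p)
    for _ in range(trials):
        x = rng.random(n) < p                         # one source realization
        k = x.sum()                                   # number of ones
        log_prob = k * np.log2(p) + (n - k) * np.log2(1 - p)
        print(f"-(1/n) log2 P(x) = {-log_prob / n:.4f}   H(p) = {h:.4f}")

if __name__ == "__main__":
    empirical_aep()
```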

Cited by 83 publications (111 citation statements)
References 85 publications (134 reference statements)

Citation statements:

“…We present and prove a process Large Deviation Principle (LDP) for the coloured random graph conditioned to have a given empirical colour measure and empirical pair measure, see Doku-Amponsah (2006), using similar coupling techniques as in the article by Boucheron et al. (2002). From this LDP and the techniques employed by Dembo and Kontoyiannis (2002) for the random field on Z^2, we obtain the proof of the Lossy AEP for the Networked Data Structures.…”
Section: Introduction (mentioning)
confidence: 93%
“…Informally, […] is the minimum rate required for a random codebook with entries chosen independently from a given [distribution] so as to compress the source with average asymptotic distortion less than [the target distortion level] [18]. Note that […] (by substituting […]).…”
Section: Theorem 1: Fix a Memoryless Channel (mentioning)
confidence: 99%
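
In the memoryless case, the rate referred to in this statement is often written through a convex-duality formula, sup over λ ≤ 0 of [λD − E_P log E_Q e^(λρ(X,Y))]. The sketch below is an illustration under assumptions of my own choosing (Bernoulli(p) source, Bernoulli(q) codeword distribution, Hamming distortion); it evaluates that formula by a grid search over λ and compares it with a crude Monte Carlo estimate of the probability that a single random codeword lands within distortion D of a fixed source string.

```python
import numpy as np

def rate_exponent(p, q, D, num_grid=20_000):
    """Dual formula sup_{lam<=0} [lam*D - E_P log E_Q exp(lam*rho(X,Y))],
    specialized to a Bernoulli(p) source, Bernoulli(q) codewords, and
    Hamming distortion rho.  Returned in bits per symbol."""
    lam = np.linspace(-30.0, 0.0, num_grid)
    log_mgf = (1 - p) * np.log((1 - q) + q * np.exp(lam)) \
              + p * np.log(q + (1 - q) * np.exp(lam))
    return np.max(lam * D - log_mgf) / np.log(2)      # nats -> bits

def match_exponent(p, q, D, n=24, num_codewords=400_000, seed=0):
    """Crude -(1/n) log2 estimate of the probability that one random codeword
    is within Hamming distortion D of a single fixed source string."""
    rng = np.random.default_rng(seed)
    x = rng.random(n) < p
    y = rng.random((num_codewords, n)) < q
    frac = np.mean((x != y).mean(axis=1) <= D)
    return -np.log2(frac) / n if frac > 0 else np.inf

if __name__ == "__main__":
    p, q, D = 0.5, 0.5, 0.1
    print("dual formula exponent:", rate_exponent(p, q, D))
    print("Monte Carlo exponent :", match_exponent(p, q, D))
```

With p = q = 0.5 the dual value reduces to the familiar 1 − h(D) bits, which serves as a sanity check on the sketch.
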
“…As mentioned in the Introduction, this coding scheme (and many variations on it) has been analyzed extensively in [28], [29], [8] and several other works cited therein. To avoid potentially infinite searches in the codebook, we make the simplifying assumption that the encoder only describes N_n when it is smaller than 2^{nb}, where b is some positive constant to be chosen later.…”
Section: Naive Coding (mentioning)
confidence: 99%
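
As an illustration of the kind of scheme being described (my own reconstruction, not code from any of the cited works), the sketch below assumes a Bernoulli(q) codebook shared between encoder and decoder through common randomness and Hamming distortion: the encoder scans the codebook for the first D-close entry, describes its index N_n when such an entry exists among the first 2^{nb} codewords, and falls back to sending the string verbatim otherwise.

```python
import numpy as np

def naive_encode(x, D, q, b, rng):
    """Scan a shared random codebook (rows ~ Bernoulli(q)^n) for the first
    entry within Hamming distortion D of x.  If one exists among the first
    2^{n*b} entries, describe its index N_n; otherwise send x verbatim.
    (Materializing the whole codebook is only feasible for toy n.)"""
    n = len(x)
    limit = int(2 ** (n * b))                 # truncation threshold 2^{nb}
    codebook = rng.random((limit, n)) < q
    dists = (codebook != x).mean(axis=1)      # per-codeword Hamming distortion
    hits = np.flatnonzero(dists <= D)
    if hits.size:                             # index costs roughly n*b bits
        return "index", int(hits[0]) + 1, codebook[hits[0]]
    return "verbatim", None, x                # rare fallback, roughly n bits

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, D, q, b = 20, 0.1, 0.5, 0.7
    x = rng.random(n) < 0.5
    kind, N_n, rec = naive_encode(x, D, q, b, rng)
    print(kind, N_n, "distortion =", float((rec != x).mean()))
```

Choosing b above the exponent from the previous sketch makes the fallback branch exponentially unlikely, so the index description dominates the rate.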