2010
DOI: 10.1109/tit.2009.2039160
Lossy Source Compression Using Low-Density Generator Matrix Codes: Analysis and Algorithms

Abstract: We study the use of low-density generator matrix (LDGM) codes for lossy compression of the Bernoulli symmetric source. First, we establish rigorous upper bounds on the average distortion achieved by check-regular ensembles of LDGM codes under optimal minimum-distance source encoding. These bounds establish that the average distortion using such bounded-degree families rapidly approaches the Shannon limit as the degrees are increased. Second, we propose a family of message-passing algorithms, ranging fr…

Cited by 66 publications (54 citation statements) · References 44 publications
“…In particular, on the social networks analyzed in this work, we show that the chance that a null model has the level of balance of the true networks is essentially equal to zero. For all three networks, the level of balance turns out to be even less than the Shannon bound one obtains developing a rate-distortion theory for the null models (25,36,37). What makes our signed networks so balanced is the skewed distribution of the signs of the edges on the users: Users with a large majority of friends, but also users with a large majority of enemies, are not causing any significant frustration.…”
mentioning
confidence: 93%
“…Similarly to what is done in [5], after each message-passing run we removed at least r_m = 1% of the variables (starting from the most biased ones), as well as all the variables with a bias greater than or equal to B_m = 0.7, until a maximum of r_m = 10% of the variables was removed. We also borrowed from [5] the choices of w_i = exp(0.05) and w_s = exp(0.10).…”
Section: Results
mentioning
confidence: 99%
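The decimation step described in this excerpt can be sketched as follows. This is a hedged reconstruction, not code from [5]: the names r_min, r_max, and b_max are our own labels for the 1%, 10%, and B_m = 0.7 thresholds, and ties and stopping details in the original may differ.

```python
def decimate(biases, r_min=0.01, r_max=0.10, b_max=0.7):
    """Select which variables to fix after a message-passing run:
    at least a fraction r_min of the variables, starting from the most
    biased ones, plus every variable with |bias| >= b_max, capped at a
    fraction r_max of all variables. Returns the chosen indices."""
    n = len(biases)
    # Visit variables from most to least biased (stable for ties).
    order = sorted(range(n), key=lambda i: abs(biases[i]), reverse=True)
    k_min = max(1, int(r_min * n))
    k_max = max(k_min, int(r_max * n))
    chosen = []
    for i in order:
        if len(chosen) >= k_max:
            break  # hit the 10% cap
        if len(chosen) < k_min or abs(biases[i]) >= b_max:
            chosen.append(i)
    return chosen
```

For instance, with five strongly biased variables among a hundred near-zero ones, the five biased indices are selected and nothing else, since the remaining biases fall below b_max.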
“…DFTq and IDFTq denote the q-point Fourier transform and its inverse, respectively. As a remark, we were able to derive the given formula for [λ…]; due to the limited space, the actual derivations were not included; however, one can notice how these message-updating rules appear as an extension of the ones relative to the GF(2) field derived in [5].…”
Section: Binary Source Compression with GF(q)-Quantized LDGM Codes
mentioning
confidence: 99%
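The DFTq/IDFTq trick in this excerpt rests on the convolution theorem: when a check constraint is a sum modulo q, the distribution of that sum is the circular convolution of the incoming message distributions, which a q-point DFT turns into a pointwise product. A minimal sketch of just this convolution step, assuming messages are length-q probability vectors over Z_q (it does not reproduce the full update rules of [5]):

```python
import numpy as np

def check_convolution_dftq(messages):
    """Circularly convolve a list of length-q probability vectors via
    the q-point DFT: transform each message, multiply pointwise, and
    invert. The result is the distribution of the sum mod q of
    independent symbols drawn from the incoming messages."""
    out = np.fft.fft(messages[0])
    for m in messages[1:]:
        out *= np.fft.fft(m)  # convolution theorem: product in DFT domain
    out = np.real(np.fft.ifft(out))
    out = np.clip(out, 0.0, None)  # clear tiny negative round-off
    return out / out.sum()         # renormalize to a probability vector
```

As a sanity check, convolving a point mass at symbol 1 with a point mass at symbol 2 over q = 5 yields a point mass at symbol 3.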