1999
DOI: 10.1142/s0129183199000528
Information Content in Uniformly Discretized Gaussian Noise: Optimal Compression Rates

Abstract: We approach the theoretical problem of compressing a signal dominated by Gaussian noise. We present expressions for the compression ratio which can be reached, in the light of Shannon's noiseless coding theorem, for a linearly quantized stochastic Gaussian signal (noise). The compression ratio decreases logarithmically with the amplitude of the frequency spectrum P(f) of the noise. Entropy values and compression rates are shown to depend on the shape of this power spectrum, given different norma…
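The abstract's central quantities can be made concrete with a minimal numerical sketch (an illustration under stated assumptions, not the paper's own code): it uses the standard fine-quantization approximation that a zero-mean Gaussian of rms σ quantized with step Δ has Shannon entropy H ≈ log_2(√(2πe) σ/Δ) bits per sample, and takes the compression ratio as the quantizer word length N_bits divided by H. The 16-bit word length and the σ values below are illustrative choices, not values from the paper.

import numpy as np
from math import erf, sqrt

def quantized_gaussian_entropy(sigma, step=1.0):
    """Exact Shannon entropy (bits/sample) of a zero-mean Gaussian with rms
    sigma, linearly quantized with bin width step (bins cover +/- 8 sigma)."""
    kmax = int(np.ceil(8.0 * sigma / step))
    def cdf(x):  # standard normal CDF
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))
    edges = (np.arange(-kmax, kmax + 1) + 0.5) * step / sigma
    p = np.diff([cdf(x) for x in edges])
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

N_BITS = 16  # illustrative quantizer word length (assumption, not from the paper)
for sigma in (2.0, 8.0, 32.0, 128.0):  # noise rms in units of the quantization step
    H = quantized_gaussian_entropy(sigma)
    print(f"sigma = {sigma:6.1f} steps   H = {H:5.2f} bits   C_r ~ {N_BITS / H:4.2f}")

Doubling the noise amplitude adds roughly one bit to H, so the achievable ratio N_bits/H falls only logarithmically with the amplitude, mirroring the abstract's statement.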

Cited by 5 publications (7 citation statements)
References 14 publications
“…Resuming what is discussed in Romeo et al (1999), Maris et al (2000a), Gaztañaga et al (2000), and Gaztañaga et al (2001) and using the formalism introduced in Maris et al (2000a), the maximum compression rate, C_r, achievable by any lossless compression method for any digitized signal represented by integers of N_bits bits and with Shannon's entropy H is (Nelson & Gailly 1996) …”
Section: Discussion (mentioning, confidence: 99%)
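The quoted statement is truncated before the expression itself; the completion sketched below is an assumption based on the usual definition of the lossless bound (fixed word length divided by per-sample entropy), not text from the cited papers, and the numbers passed in are purely illustrative.

def max_compression_rate(n_bits: int, entropy_bits_per_sample: float) -> float:
    # Assumed form of the bound: C_r <= N_bits / H.  A signal stored as
    # n_bits-bit integers with Shannon entropy H bits per sample cannot be
    # losslessly compressed by more than this factor.
    return n_bits / entropy_bits_per_sample

# Illustrative values only: 16-bit samples with ~5.9 bits/sample of entropy
# give a ceiling of about 2.7, the order of the figure quoted below for P/LFI.
print(max_compression_rate(16, 5.9))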
“…As amply discussed in Romeo et al (1999), Maris et al (2000a), Gaztañaga et al (2000) and Gaztañaga et al (2001) in the case of CMB data, the output of the acquisition chain is white-noise dominated. Taking as a representative case P/LFI, the best compression rate, C_r, achievable by compressing the output of the acquisition chain, even in the case of an ideal compressor, is C_r < 2.7, to be compared with a required C_r ≳ 8.…”
Section: Introduction (mentioning, confidence: 99%)
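A hedged back-of-the-envelope reading of these figures (the 16-bit word length is assumed for illustration; only C_r < 2.7 and C_r ≳ 8 come from the quote): a lossless ceiling of C_r ≈ 2.7 on 16-bit samples corresponds to an entropy of about 16/2.7 ≈ 5.9 bits per sample, while meeting C_r ≳ 8 would require bringing the stored information down to about 16/8 = 2 bits per sample, below the entropy of the white-noise-dominated signal and therefore out of reach of any lossless scheme.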
“…For nearly-Gaussian probability distributions digitized to b bits per σ, the Shannon entropy is H ≈ log_2 √(2πe) + log_2 b (e.g. Romeo et al 1999). For b = 1 a numerical evaluation of the Shannon entropy shows that ≥ 2.1 bpp will be required.…”
Section: Codec Noise (mentioning, confidence: 99%)
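Both numbers in this statement can be checked with a short numerical sketch (an independent check under an assumed reading, not the citing paper's code): taking b, as the formula implies, to count quantization steps per σ (i.e. Δ = σ/b), the approximation gives log_2 √(2πe) ≈ 2.05 bpp at b = 1, while the exact discrete entropy comes out near 2.10 bpp, matching the "≥ 2.1 bpp" figure.

import numpy as np
from math import erf, sqrt, log2, pi, e

def exact_entropy(b):
    """Exact Shannon entropy (bits/sample) of a unit-rms Gaussian quantized
    with step 1/b, i.e. b quantization steps per sigma (assumed reading of b)."""
    kmax = int(np.ceil(8 * b)) + 1
    def cdf(x):  # standard normal CDF
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))
    edges = (np.arange(-kmax, kmax + 1) + 0.5) / b
    p = np.diff([cdf(x) for x in edges])
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

b = 1
approx = log2(sqrt(2 * pi * e)) + log2(b)  # ~2.05 bpp at b = 1
print(f"approximation: {approx:.2f} bpp   exact: {exact_entropy(b):.2f} bpp")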
“…The variations of the spectral index in different observations by XMM-Newton, ASCA and BeppoSAX suggest that the source undergoes a spectral transition similar to those in the other XRBs. The EGRET source 3EG J1639−4702 (Romero et al 1999) could be associated with the X-ray source considering the large position uncertainties. The infrared images from the 2MASS database and the optical observations indicate a bright and massive companion for the X-ray source.…”
Section: IGR J16318−4848: This Transient Source Was Discovered By… (mentioning, confidence: 99%)