1957
DOI: 10.1002/j.1538-7305.1957.tb03858.x

Instantaneous companding of quantized signals

Abstract: Instantaneous companding may be used to improve the quantized approximation of a signal by producing effectively nonuniform quantization. A revision, extension, and reinterpretation of the analysis of Panter and Dite permits the calculation of the quantizing error power as a function of the degree of companding, the number of quantizing steps, the signal volume, the size of the “equivalent dc component” in the signal input to the compressor, and the statistical distribution of amplitudes in the signal. It appe…

Cited by 164 publications (48 citation statements)
References 11 publications

“…Gish and Pierce [23] (whose preprint, received in the fall of 1967, introduced me to the Panter and Dite reference after my first presentation of some of these results) show that uniform, one-dimensional quantization is asymptotically (large K, small error) optimum in minimizing distortion for given entropy for a large class of measures, including mean rth-power difference measures but not the geometric mean. Wood [33] independently reaches the same conclusion as Gish and Pierce in the mean-square one-dimensional case, citing Roe but not Panter and Dite or Algazi and missing (2). The multidimensional minimal-entropy quantization problem is considered by Schutzenberger [24], who gives inequalities with unknown coefficients on trading relations between entropy and mean rth-power difference measures, and by Zador [4], who gives an asymptotic (large K, small error) result with unknown coefficient for this case too.…”
Section: B. History and Literature
confidence: 65%
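As a rough sketch of the high-rate reasoning behind the Gish and Pierce result (standard textbook argument, not quoted from the citing paper): a uniform quantizer with step Δ has mean-square error about Δ²/12 and output entropy about h(X) − log₂ Δ bits per sample, where h(X) is the differential entropy of the source. Eliminating Δ gives

$$D \approx \frac{\Delta^2}{12}, \qquad H \approx h(X) - \log_2 \Delta \;\;\Longrightarrow\;\; D \approx \tfrac{1}{12}\, 2^{2h(X)}\, 2^{-2H},$$

so at a fixed output entropy H no smooth companding curve improves on the uniform quantizer to first order, which is the sense in which uniform quantization is asymptotically optimum.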
“…Max [7], Bruce [8], and Bluestein [9], looking for algorithms for finding optimum quantizers (for signal plus noise in [9]), miss the reference and the result; Algazi [6], finding simpler suboptimal algorithms, also misses the reference but cites the quantizer from Roe and rediscovers (2). As Smith [2] notes, Sheppard [10] was the first to give the effect of a uniform quantizer (i.e., grouping statistical data in uniform intervals) on variance, in 1898. Sheppard's correction, missed by Clavier et al. [11] in the first paper on PCM distortion, is rederived by Bennett [12] and by Oliver et al. [13].…”
Section: B. History and Literature
confidence: 99%
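For context, the Sheppard's correction referred to above is the standard statistical result (stated here from general knowledge rather than from the cited sources) that grouping data into uniform intervals of width h inflates the sample variance by roughly h²/12:

$$\sigma^2 \approx \sigma_{\text{grouped}}^2 - \frac{h^2}{12},$$

which is the same Δ²/12 term that later appears as the noise power of a uniform quantizer.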
“…Each of these methods can normalize by a user-supplied value, by the peak amplitude in a block, or by other methods based on the "local" statistics of the data. A modified mu-law method [16] that we have developed compresses the amplitude logarithmically but preserves the phase. All of the methods preserve the phase as much as possible, since, for our waveforms, it is the phase information that provides most of the signal processing gain.…”
Section: Software
confidence: 99%
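The excerpt above describes a modified mu-law method that compresses amplitude logarithmically while preserving phase, but the citing authors' implementation is not shown. The following is a minimal sketch of that idea for complex baseband samples, assuming block-peak normalization and μ = 255; the function name and parameters are illustrative, not the authors' code.

```python
import numpy as np

def mu_law_compress_magnitude(x, mu=255.0, peak=None):
    """Compress the magnitude of complex samples with a mu-law-style curve
    while leaving the phase untouched (illustrative sketch only)."""
    x = np.asarray(x, dtype=np.complex128)
    if peak is None:
        peak = np.max(np.abs(x))          # normalize by the block peak
    if peak == 0:
        return x.copy()
    mag = np.abs(x) / peak                # magnitudes scaled into [0, 1]
    compressed = np.log1p(mu * mag) / np.log1p(mu)   # mu-law curve on magnitude
    return compressed * np.exp(1j * np.angle(x))     # phase carried over unchanged

# Example: small and large samples are pulled closer together in magnitude,
# but each keeps its original phase angle.
samples = np.array([0.01 + 0.02j, 0.5 - 0.3j, 1.0 + 0.0j])
print(mu_law_compress_magnitude(samples))
```

Because only the magnitude passes through the logarithmic curve, every sample's phase is returned exactly, matching the passage's emphasis on preserving the phase information that carries the processing gain.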
“…The quantization levels are most closely spaced at low amplitudes, where most speech samples lie. Smith [3] gives the quantization noise for a μ-law quantizer as…
Section: Introduction
confidence: 99%
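For reference, the standard μ-law compressor characteristic (general knowledge; the truncated noise expression from the citing paper is not reproduced here) is

$$F(x) = \operatorname{sgn}(x)\,\frac{\ln\!\bigl(1 + \mu |x|/V\bigr)}{\ln(1+\mu)}, \qquad |x| \le V,$$

and uniformly quantizing F(x) makes the effective step size in x proportional to 1 + μ|x|/V, so the levels are indeed most closely spaced near zero amplitude, as the passage states.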