1997
DOI: 10.1109/83.568923

Lossless compression of continuous-tone images via context selection, quantization, and modeling

Abstract: Context modeling is an extensively studied paradigm for lossless compression of continuous-tone images. However, without careful algorithm design, high-order Markovian modeling of continuous-tone images is too expensive in both computational time and space to be practical. Furthermore, the exponential growth of the number of modeling states in the order of a Markov model can quickly lead to the problem of context dilution; that is, an image may not have enough samples for good estimates of conditional probabilities…

Cited by 206 publications (10 citation statements)
References 18 publications
“…Within a local context, the strength of the local error has a strong correlation with the local variance of the original pixel values. Hence, the number of conditional entropy coding contexts can be reduced by representing each context by its error strength [29,30,31]. The strength of the error can be estimated using the local gradients, which are pre-computed during the initial prediction. Except for the top layer of the pyramid, each layer is scanned two times, and the interpolation is done in two stages as described in detail in [10].…”
Section: Determination and Quantization of Entropy Coding Context
confidence: 99%
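
The passage above describes shrinking the set of conditional entropy coding contexts by quantizing a gradient-based estimate of local error strength. A minimal sketch of that idea follows; the causal-neighbour gradient window and the threshold values are illustrative assumptions, not the parameters used in the cited works.

```python
import numpy as np

# Hypothetical quantizer bin boundaries; practical coders (e.g. CALIC-style
# schemes) derive such thresholds offline from training images.
THRESHOLDS = (5, 15, 25, 42, 60, 85, 140)

def error_strength(img, i, j):
    """Estimate the local error strength at pixel (i, j).

    Uses absolute horizontal and vertical differences of causal neighbours
    (pixels already coded, above and to the left) as a proxy for the local
    variance of the prediction error.  Requires i >= 2 and j >= 2.
    """
    d_h = abs(int(img[i, j - 1]) - int(img[i, j - 2])) + \
          abs(int(img[i - 1, j]) - int(img[i - 1, j - 1]))
    d_v = abs(int(img[i, j - 1]) - int(img[i - 1, j - 1])) + \
          abs(int(img[i - 1, j]) - int(img[i - 2, j]))
    return d_h + d_v

def quantize_context(strength, thresholds=THRESHOLDS):
    """Map the continuous strength value to one of len(thresholds) + 1 bins."""
    for ctx, t in enumerate(thresholds):
        if strength <= t:
            return ctx
    return len(thresholds)

# Example: pick the coding context for one pixel of a toy image.
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
ctx = quantize_context(error_strength(img, 4, 4))
```

Grouping pixels by quantized error strength keeps the number of conditional distributions small enough to estimate reliably, which is the counter-measure to the context dilution problem raised in the abstract.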
“…Context-based error modeling has gained much research importance in improving the performance of compression algorithms [4, 22, 42, 43]. It is known that most state-of-the-art lossless coding techniques comprise prediction, context modeling of the prediction error, followed by entropy encoding [42].…”
Section: Improved Context-based Error Modeling
confidence: 99%
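
The predict / context-model / entropy-code chain mentioned in this passage can be sketched in a few lines. The west-neighbour predictor, the gradient-based context rule, and the bucket width below are placeholders (assumptions for illustration only); a real coder would feed the per-context residual counts to an adaptive arithmetic coder.

```python
import numpy as np
from collections import defaultdict

def gather_context_statistics(img, n_contexts=8, q_step=32):
    """Collect per-context residual histograms for a single coding pass.

    Structure only: predict each pixel from causal data, select a context
    from the local gradient, and account for the residual under that
    context's conditional distribution.
    """
    counts = defaultdict(lambda: defaultdict(int))   # context -> residual -> count
    img = img.astype(np.int32)
    for i in range(img.shape[0]):
        for j in range(1, img.shape[1]):
            pred = img[i, j - 1]                      # trivial causal predictor
            grad = abs(img[i, j - 1] - img[i, j - 2]) if j >= 2 else 0
            ctx = min(grad // q_step, n_contexts - 1)
            counts[ctx][int(img[i, j] - pred)] += 1   # would drive the entropy coder
    return counts

# Example: residual histograms of a toy ramp image.
stats = gather_context_statistics(np.tile(np.arange(16, dtype=np.uint8), (16, 1)))
```

Conditioning the residual statistics on even this crude context already separates smooth regions from busy ones, which is where the coding gain of context modeling comes from.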
“…Context modeling of the prediction error is a means to separate these distributions, thereby adjusting the offset to yield a zero-mean Laplacian distribution. Such a scheme is referred to as context-based bias cancellation [4, 22, 42, 43] and has been applied to EEG signals using linear predictors [4] and neural network predictors [22]. In [4, 22], contexts were formed by computing the difference between two adjacent samples.…”
Section: Improved Context-based Error Modeling
confidence: 99%
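
Context-based bias cancellation, as described above, keeps a running mean of the prediction error within each context and folds it back into the predictor, so that every conditional error distribution becomes approximately zero-mean. A minimal, hypothetical sketch (counter clamping and the periodic rescaling used in practical coders are omitted):

```python
from collections import defaultdict

class BiasCanceller:
    """Per-context feedback of the mean prediction error (bias)."""

    def __init__(self):
        self.err_sum = defaultdict(int)   # accumulated error per context
        self.count = defaultdict(int)     # samples seen per context

    def corrected_prediction(self, raw_pred, ctx):
        """Add the estimated bias of this context to the raw prediction."""
        if self.count[ctx] == 0:
            return raw_pred
        return raw_pred + round(self.err_sum[ctx] / self.count[ctx])

    def update(self, ctx, true_value, raw_pred):
        """Fold the observed error into the running bias estimate."""
        self.err_sum[ctx] += true_value - raw_pred
        self.count[ctx] += 1

# Usage inside a coding loop (predictor and context are whatever the codec uses):
bc = BiasCanceller()
pred, ctx, x = 120, 3, 126          # hypothetical raw prediction, context, sample
residual = x - bc.corrected_prediction(pred, ctx)
bc.update(ctx, x, pred)
```

Because the entropy coder then sees residuals that are roughly zero-mean in every context, a single Laplacian-shaped model per context fits well, which is the effect the quoted passage refers to.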