2005
DOI: 10.1016/j.image.2005.03.003
Lloyd clustering of Gauss mixture models for image compression and classification

Cited by 44 publications (27 citation statements)
References 55 publications
“…It is paired with its hand-labelled classified image. These photographs have also been studied by Aiyer et al (2005), Li et al (2000), Pyun et al (2007) and Pyun et al (2009). To design a computer-based system to measure the spatial correlation in land use, we first segment the images into manmade and natural areas using a block-based classifier.…”
Section: Aerial Image
confidence: 89%
“…In the analysis, we use 4 × 4 sub-blocks (256 × 256 sub-blocks in total) and estimate the Gauss mixture models for the manmade and natural regions using 5 training images. The block-processing steps are the same as those in Aiyer et al (2005) and Pyun et al (2007), and their details are omitted here. Using the estimated GMMs, we classify each sub-block into the class having the highest likelihood.…”
Section: Aerial Image
confidence: 99%
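The classification step the excerpt describes (assign each sub-block to the class whose Gauss mixture model gives the highest likelihood) can be sketched as follows. This is a minimal illustration, not the cited authors' code; the two single-component "mixtures" and the feature vectors are hypothetical stand-ins for the manmade/natural models estimated from training images.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gmm_loglik(x, weights, means, covs):
    """Log-likelihood of feature vector x under a Gauss mixture model."""
    p = sum(w * multivariate_normal.pdf(x, mean=m, cov=c)
            for w, m, c in zip(weights, means, covs))
    return np.log(p)

def classify_block(x, gmms):
    """Assign block x to the class whose GMM gives the highest likelihood."""
    scores = {label: gmm_loglik(x, *params) for label, params in gmms.items()}
    return max(scores, key=scores.get)

# Hypothetical per-class models (weights, means, covariances); a real
# system would estimate several mixture components from training blocks.
gmms = {
    "manmade": ([1.0], [np.zeros(2)], [np.eye(2)]),
    "natural": ([1.0], [np.full(2, 3.0)], [np.eye(2)]),
}
label = classify_block(np.array([0.1, -0.2]), gmms)  # nearer the manmade mean
```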
“…Therefore, it would be of interest to relax the nearest-neighbor constraint and allow more general encoders. For instance, one could use Gauss Mixture Vector Quantization (GMVQ) [1] to model the distribution of the features as a Gauss mixture, thus allowing for partition cells with quadratic boundaries. Going beyond information loss minimization, our approach readily extends to any Bregman divergence [6], [2], not just the relative entropy.…”
Section: Discussion
confidence: 99%
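The quadratic-boundary point in the excerpt can be made concrete: if each codeword carries its own mean and covariance and inputs are encoded by minimum negative Gaussian log-likelihood, the partition cells are bounded by quadratic surfaces rather than the hyperplanes of a Euclidean nearest-neighbor encoder. The two codewords below are hypothetical, chosen only to show the effect of unequal covariances.

```python
import numpy as np

def gauss_distortion(x, mean, cov):
    """Per-codeword distortion in GMVQ-style encoding: the negative Gaussian
    log-likelihood up to a constant, i.e. a Mahalanobis term plus a
    log-determinant penalty. Because the quadratic form depends on each
    codeword's covariance, cell boundaries are quadratic, not hyperplanes."""
    d = x - mean
    return 0.5 * (d @ np.linalg.inv(cov) @ d + np.log(np.linalg.det(cov)))

# Two hypothetical codewords with different covariances.
codebook = [
    (np.array([0.0, 0.0]), np.eye(2)),
    (np.array([2.0, 0.0]), 4.0 * np.eye(2)),
]

def encode(x):
    """Index of the codeword with minimum distortion for input x."""
    return min(range(len(codebook)),
               key=lambda i: gauss_distortion(x, *codebook[i]))
```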
“…(2) can be interpreted as the difference between the amount of information provided by X about Y and the amount of information provided by K about Y, and minimizing it leads to a clustering that throws away as little information about Y as possible. Unfortunately, the algorithm of [10] is not suitable for our target application of quantizer design. For one, quantization requires an encoding rule that works on continuous data, does not depend on the labels other than those of the training examples, and can be applied outside the training set.…”
Section: Previous Work
confidence: 99%
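The information loss in the excerpt, I(X;Y) − I(K;Y), can be checked numerically on a toy joint distribution; merging values of X into clusters K can only discard information about Y, so the difference is non-negative. The pmf below is made up purely for illustration.

```python
import numpy as np

def mutual_info(joint):
    """Mutual information (in nats) between the row and column variables
    of a joint pmf given as a 2-D array."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# Toy joint p(x, y) over 4 values of X and 2 labels Y (hypothetical numbers).
pxy = np.array([[0.20, 0.05],
                [0.15, 0.10],
                [0.05, 0.20],
                [0.10, 0.15]])

# Hard clustering K of X's four values into two clusters: {x0,x1}, {x2,x3}.
pky = np.vstack([pxy[:2].sum(axis=0), pxy[2:].sum(axis=0)])

info_loss = mutual_info(pxy) - mutual_info(pky)  # >= 0 (data processing)
```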
“…In particular, we are interested in fitting Gaussian mixture models to data within a VQ framework [1]. In this approach, the PDF of an input vector X is represented as a weighted collection of Gaussians:…”
Section: Pixel Color Clustering and KL Divergence
confidence: 99%
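The excerpt breaks off before the formula it introduces. The standard Gauss-mixture density it refers to, written out with generic symbols ($\pi_i$, $\mu_i$, $\Sigma_i$ denote the component weights, means, and covariances; the notation here is illustrative, not copied from the cited paper), is:

$$
p_X(x) \;=\; \sum_{i=1}^{M} \pi_i \,\mathcal{N}(x;\,\mu_i,\Sigma_i),
\qquad \pi_i \ge 0, \quad \sum_{i=1}^{M} \pi_i = 1 .
$$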