2005
DOI: 10.1007/s11047-004-9619-8
Vector quantization using information theoretic concepts

Abstract: The process of representing a large data set with a smaller number of vectors in the best possible way, also known as vector quantization, has been intensively studied in recent years. Very efficient algorithms like the Kohonen self-organizing map (SOM) and the Linde-Buzo-Gray (LBG) algorithm have been devised. In this paper a physical approach to the problem is taken, and it is shown that by considering the processing elements as points moving in a potential field an algorithm equally efficient …

Cited by 60 publications (38 citation statements)
References 20 publications
“…The Cauchy-Schwartz divergence [9] is a fundamental tool in the GRZ development. It allows the calculation of 'distances' between different probability density functions (pdfs).…”
Section: Methods
confidence: 99%
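The Cauchy-Schwartz divergence the statement refers to is commonly written in the information-theoretic learning literature as D_CS(p, q) = -log( ∫pq / √(∫p² ∫q²) ), which is zero iff p = q. A minimal sketch for discrete pdfs (function name and variable names are illustrative, not taken from the cited papers):

```python
import math

def cauchy_schwarz_divergence(p, q):
    """D_CS(p, q) = -log( <p, q> / (||p|| * ||q||) ) for discrete pdfs.

    Symmetric and non-negative; zero when p == q. Not a true metric
    (the triangle inequality does not hold).
    """
    inner = sum(pi * qi for pi, qi in zip(p, q))
    norm_p = math.sqrt(sum(pi * pi for pi in p))
    norm_q = math.sqrt(sum(qi * qi for qi in q))
    return -math.log(inner / (norm_p * norm_q))

# Identical distributions give (numerically) zero divergence;
# distinct ones give a strictly positive value.
p = [0.2, 0.3, 0.5]
q = [0.5, 0.3, 0.2]
print(cauchy_schwarz_divergence(p, p))
print(cauchy_schwarz_divergence(p, q))
```

By the Cauchy-Schwarz inequality the argument of the log is at most 1, which is what guarantees non-negativity of the divergence.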
“…The ITL approach proposed in [4] has overcome this setback by extracting information directly from the observations. Let us consider the Cauchy-Schwartz divergence, between two pdfs p and q, as defined in [9]:…”
Section: Final Remarks
confidence: 99%
“…The Linde-Buzo-Gray (LBG) algorithm [3,4] and the self-organization map (SOM) algorithm [5][6][7] are two of the most popular vector quantization algorithms. Based on information theoretic concepts, vector quantization algorithms which aim to minimize the Cauchy-Schwartz (C-S) divergence or the Kullback-Leibler (K-L) divergence between the distributions of the original data and the reproduction vectors have also been devised and have been proven to perform better than the LBG and SOM algorithms [8,9].…”
Section: Introduction
confidence: 99%
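The LBG algorithm mentioned in the statement above grows the codebook by splitting each codeword into a perturbed pair, then refines with Lloyd iterations (nearest-codeword assignment followed by centroid updates). A minimal one-dimensional sketch, assuming scalar data (names are illustrative, not the cited implementations):

```python
def lbg(data, codebook_size, epsilon=0.01, iters=20):
    """Linde-Buzo-Gray: split codewords to double the codebook, then Lloyd-refine."""
    codebook = [sum(data) / len(data)]  # start from the global centroid
    while len(codebook) < codebook_size:
        # Split every codeword into a slightly perturbed pair.
        codebook = [c * (1 + s) for c in codebook for s in (epsilon, -epsilon)]
        for _ in range(iters):  # Lloyd iterations
            cells = [[] for _ in codebook]
            for x in data:  # assign each sample to its nearest codeword
                i = min(range(len(codebook)),
                        key=lambda j: (x - codebook[j]) ** 2)
                cells[i].append(x)
            # Move each codeword to the centroid of its cell (keep it if empty).
            codebook = [sum(cell) / len(cell) if cell else codebook[i]
                        for i, cell in enumerate(cells)]
    return sorted(codebook)

data = [0.0, 0.1, 0.2, 0.9, 1.0, 1.1]
print(lbg(data, 2))  # two codewords near the cluster means 0.1 and 1.0
```

The divergence-based quantizers cited in [8,9] replace this distortion-driven update with gradient steps on the C-S or K-L divergence between the data and codeword distributions, which is why they need an estimate of the global data density.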
“…In terms of distributed vector quantization, the LBG and SOM algorithms have been successfully extended to the distributed case in [29] and the proposed distributed LBG and SOM algorithms achieve performances close to that of the corresponding centralized LBG and SOM algorithms, respectively. Since the simulation results in literature on centralized vector quantization have shown that algorithms based on C-S divergence and K-L divergence can achieve better performances than the LBG and SOM algorithms [8,9], it is a natural thought to develop divergence-based vector quantization algorithms in the field of distributed processing. However, the existing divergence-based vector quantization algorithms [8,9] cannot be directly/easily extended to the distributed case due to the lack of data samples in estimating the global data distribution for each individual node (details are provided in the following section).…”
Section: Introduction
confidence: 99%