2000
DOI: 10.1006/csla.2000.0143
Tree-structured vector quantization for speech recognition

Cited by 7 publications (8 citation statements). References 14 publications.
“…Let t = 1 and R = ∅. /* The process of GLA */ Partition the remaining input vector X − R into a set of disjoint blocks B_j, j = 1, 2, ….”
Section: A. Notations
confidence: 99%
“…In general, VQ uses a codebook to encode and decode the signal and transmits the compressed signal over a communication channel. Because it is simple and easy to implement, VQ has been widely used in applications such as pattern recognition [1], [2], [3], [4], pattern compression [5], [6], speech recognition [7], [8], and face detection [9]. The Linde-Buzo-Gray (LBG) algorithm [10], also called the Generalized Lloyd Algorithm (GLA), is the most widely used VQ method due to its simplicity and relatively good fidelity.…”
Section: Introduction
confidence: 99%
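The GLA/LBG procedure quoted above can be sketched in a few lines: alternate a nearest-neighbour partition step with a centroid update step until the codebook stabilises. This is a minimal pure-Python illustration, not the implementation from any of the cited works; the function name `gla` and the deterministic initialisation are my own choices.

```python
def gla(vectors, k, iters=20):
    """Minimal sketch of the Generalized Lloyd (LBG) algorithm:
    alternate nearest-neighbour partitioning and centroid updates."""
    # Deterministic initialisation: take the first k training vectors.
    codebook = [tuple(v) for v in vectors[:k]]
    for _ in range(iters):
        # Partition step: assign each vector to its nearest codeword.
        cells = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(v, codebook[i])))
            cells[j].append(v)
        # Update step: move each codeword to the centroid of its cell.
        for j, cell in enumerate(cells):
            if cell:  # leave a codeword untouched if its cell is empty
                dim = len(cell[0])
                codebook[j] = tuple(sum(v[d] for v in cell) / len(cell)
                                    for d in range(dim))
    return codebook

# Toy data: two well-separated clusters in 2-D.
data = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9)]
cb = gla(data, k=2)
```

With the toy data above, the two codewords settle on the centroids of the two clusters after a couple of iterations.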
“…This is probably because DHMM was most popular in the 1980s, when memory resources were quite limited and could not accommodate large models. The only few works with large codebooks that we are aware of are in [5,53,24].…”
Section: Very Large Codebook DHMM
confidence: 99%
“…In [5], both VQ- and MMI-based tree-structured codebooks were investigated for a very large codebook system with 16–64 K codewords. To robustly estimate the model parameters, three types of tree-structured smoothing techniques (mixture smoothing, smoothing by adding 1, and Gaussian smoothing) were explored.…”
Section: Very Large Codebook DHMM
confidence: 99%
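The point of a tree-structured codebook, as discussed above, is that encoding descends a tree of test centroids instead of scanning all codewords, so a 64 K codebook needs only depth-many distance computations. A minimal sketch, assuming a binary tree whose internal nodes carry routing centroids and whose leaves carry codeword indices (the `Node` class and hand-built tree below are hypothetical, not the structure used in [5]):

```python
class Node:
    """Node of a binary tree-structured codebook.  Internal nodes hold a
    centroid used only for routing; leaves also carry a codeword index."""
    def __init__(self, centroid, left=None, right=None, index=None):
        self.centroid, self.left, self.right = centroid, left, right
        self.index = index

def sqdist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def encode(vector, root):
    """Descend from the root, following the child whose centroid is
    nearer: depth-many comparisons instead of one per codeword."""
    node = root
    while node.index is None:
        node = (node.left
                if sqdist(vector, node.left.centroid)
                <= sqdist(vector, node.right.centroid)
                else node.right)
    return node.index

# Hand-built 4-leaf tree over 2-D centroids (illustrative values).
leaves = [Node((0.0, 0.0), index=0), Node((0.0, 1.0), index=1),
          Node((1.0, 0.0), index=2), Node((1.0, 1.0), index=3)]
root = Node((0.5, 0.5),
            left=Node((0.0, 0.5), left=leaves[0], right=leaves[1]),
            right=Node((1.0, 0.5), left=leaves[2], right=leaves[3]))
idx = encode((0.9, 0.1), root)  # nearest codeword is (1.0, 0.0)
```

The descent makes 2 comparisons per level rather than 4 full-codebook distances; for the 16–64 K codebooks mentioned above, that gap is what makes the tree-structured search attractive.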