2004 IEEE International Conference on Systems, Man and Cybernetics (IEEE Cat. No.04CH37583)
DOI: 10.1109/icsmc.2004.1401294
A new generalized LVQ algorithm via harmonic to minimum distance measure transition

Cited by 3 publications (5 citation statements)
References 6 publications
“…(3) are replaced by harmonic average distances [74]. For more details the reader is referred to [54].…”
Section: LVQ Methods Based on Margin Maximization
Mentioning confidence: 99%
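The statement above refers to replacing a minimum-distance term with a harmonic average of distances. As a rough illustration of why this works, the harmonic average is dominated by the smallest distances, so it smoothly approximates the minimum while remaining differentiable in all prototypes. The sketch below is our own, not the cited paper's exact formulation:

```python
import numpy as np

def harmonic_average_distance(x, prototypes):
    """Harmonic average of Euclidean distances from x to a set of
    prototypes. Nearby prototypes dominate the sum of reciprocals,
    so the result stays close to the minimum distance.
    (Illustrative sketch only.)"""
    d = np.linalg.norm(prototypes - x, axis=1)
    return len(d) / np.sum(1.0 / d)

# Example: distances 1 and 9 give a harmonic average of 1.8,
# much closer to the minimum (1) than the arithmetic mean (5).
protos = np.array([[0.0, 0.0], [10.0, 0.0]])
x = np.array([1.0, 0.0])
print(harmonic_average_distance(x, protos))
```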
“…Other LVQ improvement deals with the initialization sensitivity of the original LVQ algorithms and GLVQ [21,31,53,54]. Recent extensions of the LVQ family of algorithms substitute the Euclidean distance with more general metric structures such as: weighted Euclidean metrics [18], adaptive relevance matrix metrics [61], pseudo-Euclidean metrics [22], and similarity measures in kernel feature space that lead to kernelized versions of LVQ [52].…”
Section: Introduction
Mentioning confidence: 99%
“…We have proposed a new algorithm called MLVQ1 (Modified Learning Vector Quantization), which is based on the classic LVQ algorithm (Qin, Suganthan, & Liang, 2004) and which is used during the training step (Sayoud, 2003).…”
Section: New Prosodic Methods
Mentioning confidence: 99%
“…LVQ1 is one of the LVQ algorithms (Qin, Suganthan, & Liang, 2004) used in classification. So, assume that a number of 'codebook vectors' m i (free parameter vectors) are placed into the input space to approximate various domains of the input vector x by their quantized values.…”
Section: Note That
Mentioning confidence: 99%
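The quoted description of LVQ1 (codebook vectors m_i placed in the input space and adapted toward or away from each training sample) can be sketched as a single update step. This is a generic LVQ1 illustration under standard conventions, not the MLVQ1 variant or the cited paper's method; variable names are ours:

```python
import numpy as np

def lvq1_update(codebooks, codebook_labels, x, y, lr=0.1):
    """One LVQ1 step: find the codebook vector nearest to sample x,
    then move it toward x if its class label matches y, or away
    from x otherwise. Modifies `codebooks` in place and returns it.
    (Generic LVQ1 sketch, not a specific published variant.)"""
    dists = np.linalg.norm(codebooks - x, axis=1)
    c = int(np.argmin(dists))                 # index of the winner m_c
    sign = 1.0 if codebook_labels[c] == y else -1.0
    codebooks[c] += sign * lr * (x - codebooks[c])
    return codebooks

# Example: the winner [1, 1] matches the sample's class, so it is
# pulled toward x = [0.9, 0.9].
cb = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = np.array([0, 1])
lvq1_update(cb, labels, np.array([0.9, 0.9]), 1)
```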
“…The NN decision rule assigns to an unclassified sample point the class of the nearest of a set of previously classified points [162]. The prototype based learning algorithm provides a simple and intuitive model while promising generalization performance in pattern classification tasks [163].…”
Section: Nearest Neighbor (NN)
Mentioning confidence: 99%
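The NN decision rule described in the last statement reduces to a one-line lookup: assign the query point the class of its nearest previously classified point. A minimal sketch (our own, for illustration):

```python
import numpy as np

def nn_classify(train_X, train_y, x):
    """1-NN rule: return the label of the training point nearest
    to x under the Euclidean distance. (Illustrative sketch.)"""
    i = int(np.argmin(np.linalg.norm(train_X - x, axis=1)))
    return train_y[i]

# Example: [4, 4] is nearer to [5, 5] (class 1) than to [0, 0].
X = np.array([[0.0, 0.0], [5.0, 5.0]])
y = np.array([0, 1])
print(nn_classify(X, y, np.array([4.0, 4.0])))
```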