2006
DOI: 10.1016/j.neunet.2006.05.010
Performance analysis of LVQ algorithms: A statistical physics approach

Abstract: Learning vector quantization (LVQ) constitutes a powerful and intuitive method for adaptive nearest prototype classification. However, original LVQ has been introduced based on heuristics and numerous modifications exist to achieve better convergence and stability. Recently, a mathematical foundation by means of a cost function has been proposed which, as a limiting case, yields a learning rule similar to classical LVQ2.1. It also motivates a modification which shows better stability. However, the exact dynami…

Cited by 26 publications (49 citation statements)
References 14 publications (16 reference statements)
“…Also for N → ∞ the rescaled quantity t ≡ µ/N can be conceived as a continuous time variable. Accordingly, the dynamics can be described by a set of coupled ODEs [3,10] after performing an average over the sequence of input data:…”
Section: Analysis of Learning Dynamics
confidence: 99%
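The picture quoted above (updates of size η/N, rescaled time t ≡ µ/N, and averaging over the input sequence) can be illustrated numerically. The sketch below is a minimal stand-in, not the paper's exact model: it runs online LVQ1 with two prototypes on a symmetric two-cluster Gaussian mixture in N dimensions, using a learning rate scaled by 1/N so that one unit of rescaled time corresponds to N single-example updates. All names and parameter choices (`simulate_lvq1`, `eta`, the cluster offset `l`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lvq1(N=200, t_max=50, eta=0.5):
    """Online LVQ1 on a two-cluster mixture; rescaled time t = mu / N."""
    # Cluster means at +/- l * B along a fixed unit vector B, unit-variance noise.
    B = np.zeros(N)
    B[0] = 1.0
    l = 1.0
    w = rng.normal(scale=1e-3, size=(2, N))   # two prototypes, near the origin
    traj = []                                  # prototype projections onto B
    for mu in range(t_max * N):
        sigma = int(rng.integers(2))           # class label 0 or 1
        xi = (2 * sigma - 1) * l * B + rng.normal(size=N)
        d = ((w - xi) ** 2).sum(axis=1)
        j = int(np.argmin(d))                  # winning prototype
        sign = 1.0 if j == sigma else -1.0     # attract if correct, repel if wrong
        w[j] += (eta / N) * sign * (xi - w[j]) # O(1/N) update per example
        if mu % N == 0:                        # sample once per unit of t
            traj.append(w @ B)
    return np.array(traj)

proj = simulate_lvq1()
# proj[-1] holds the two prototype projections onto the cluster axis at t = t_max;
# with the repulsion term, the prototypes end up on opposite sides of the origin.
print(proj[-1])
```

Averaging `proj` over many independent runs (or simply taking N large) smooths the trajectories toward the deterministic ODE solutions, which is the self-averaging property the quoted passage relies on.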
“…Also for N → ∞, the rescaled quantity α ≡ μ/N can be conceived as a continuous time variable. Accordingly, the dynamics can be described by a set of coupled ordinary differential equations (ODE) (Ghosh, Biehl, & Hammer, 2006) after performing an average over the sequence of input data:…”
Section: Discussion
confidence: 99%
“…Ghosh, Biehl, and Hammer (2006) studied five LVQ algorithms in detail: Kohonen's original LVQ1, unsupervised vector quantization (VQ), a mixture of VQ and LVQ, LVQ2.1, and a variant of LVQ which is based on a cost function. Surprisingly, basic LVQ1 showed very good performance in terms of stability, asymptotic generalization ability, and robustness to initializations and model parameters which, in many cases, was superior to more recent alternative proposals [8]. Sanchez and Marques (2006) introduced an adaptive algorithm for competitive training of a nearest neighbor (NN) classifier with a very small codebook, based on the well-known LVQ method, which uses an alternative neighborhood concept to estimate optimal locations of the codebook vectors [9].…”
Section: Introduction
confidence: 92%
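The algorithms named in the passage above differ mainly in their prototype update rule. The sketch below contrasts two of them under stated simplifications: unsupervised VQ (the winner moves toward the input, labels ignored) and LVQ2.1 (the closest correct and closest wrong prototypes are updated in opposite directions). The function names are illustrative, and the window condition of full LVQ2.1 — updating only when the input falls near the decision boundary — is deliberately omitted here.

```python
import numpy as np

def vq_update(w, xi, eta):
    """Unsupervised VQ: the winning prototype moves toward the input."""
    j = int(np.argmin(((w - xi) ** 2).sum(axis=1)))
    w[j] += eta * (xi - w[j])

def lvq21_update(w, labels, xi, y, eta):
    """LVQ2.1 (window rule omitted): attract the closest correct prototype,
    repel the closest wrongly labeled one."""
    d = ((w - xi) ** 2).sum(axis=1)
    correct = np.where(labels == y)[0]
    wrong = np.where(labels != y)[0]
    j = correct[np.argmin(d[correct])]   # closest prototype with label y
    k = wrong[np.argmin(d[wrong])]       # closest prototype with another label
    w[j] += eta * (xi - w[j])            # attraction
    w[k] -= eta * (xi - w[k])            # repulsion
```

The unchecked repulsion term is what makes plain LVQ2.1 divergent in the long run — repelled prototypes can drift away without bound — which is why the cost-function-based modification discussed in the paper replaces the hard rule with a stabilized version.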