2014
DOI: 10.1007/978-3-319-12084-3_9
Distance Measures for Prototype Based Classification

Abstract: The basic concepts of distance-based classification are introduced in terms of clear-cut example systems. The classical k-Nearest Neighbor (kNN) classifier serves as the starting point of the discussion. Learning Vector Quantization (LVQ) is introduced, which represents the reference data by a few prototypes. This requires a data-driven training process; examples of heuristic and cost function based prescriptions are presented. While the most popular measure of dissimilarity in this context is the Euclidean …
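The prototype-based scheme the abstract describes can be illustrated with a minimal nearest-prototype classifier: unlike kNN, which stores all reference data, only a few labeled prototypes are kept and a query point receives the label of its closest prototype. This is an illustrative sketch with hypothetical toy data, not code from the chapter:

```python
import numpy as np

def nearest_prototype_classify(x, prototypes, labels):
    """Assign x the label of its closest prototype (Euclidean distance)."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return labels[np.argmin(d)]

# Two prototypes per class (hypothetical toy configuration)
W = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
c = np.array([0, 0, 1, 1])
print(nearest_prototype_classify(np.array([0.4, 0.2]), W, c))  # → 0
```

In LVQ, the prototype positions W would themselves be adapted by a data-driven training process rather than fixed by hand.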

Cited by 30 publications (33 citation statements)
References 39 publications (55 reference statements)
“…LVQ can be extended by the powerful concept of metric learning, which has become popular in distance-based classification lately [29], [30], [31]. In particular, there exists a generalisation of GLVQ towards a general quadratic form (x − w_j)^T Λ (x − w_j) with positive semi-definite matrix Λ, which is proposed under the acronym GMLVQ [27].…”
Section: Learning Vector Quantization
confidence: 99%
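The quadratic-form distance quoted above can be sketched directly. A common way to keep Λ positive semi-definite is to parametrize it as Λ = Ω^T Ω, so the distance reduces to the squared Euclidean norm of the transformed difference Ω(x − w_j). The matrix below is a hypothetical learned transform, for illustration only:

```python
import numpy as np

def gmlvq_distance(x, w, Omega):
    """Quadratic-form distance d(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega guaranteed positive semi-definite."""
    diff = Omega @ (x - w)
    return float(diff @ diff)

Omega = np.array([[1.0, 0.5],   # hypothetical learned transform
                  [0.0, 1.0]])
x = np.array([1.0, 2.0])
w = np.array([0.0, 1.0])
print(gmlvq_distance(x, w, Omega))  # → 3.25
```

With Omega equal to the identity matrix, this recovers the ordinary squared Euclidean distance; metric learning adapts Omega from the data.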
“…We will rely on powerful formalisations of LVQ learning in terms of cost functions, such as proposed in the approaches [25], [26], [27], and we will use its recent extensions to incremental learning like the incremental, online LVQ (ioLVQ) [28]. Notably, state of the art LVQ classifiers usually integrate the powerful concept of metric learning [29], [30], [31]. For a comparison we refer to state of the art incremental learners, such as incremental support vector machines (iSVM)…”
Section: Introduction
confidence: 99%
“…Several attempts were proposed to make progress regarding this problem, ranging from intelligent initialization to the harmonic to minimum LVQ algorithm (H2M-LVQ, [71]). This latter approach starts with a different cost function compared to GLVQ, incorporating the harmonic average distance instead of d^+ and d^− in (9). According to this average, the whole distance information between the presented data sample and all prototypes is taken into account, which reduces the initialization sensitivity.…”
Section: Robustness, Classification Certainty and Border Sensitivity
confidence: 99%
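The harmonic average distance mentioned above can be sketched as follows: instead of looking only at the closest correct and incorrect prototypes, every prototype contributes through the harmonic mean of the distances, so even badly initialized prototypes receive a gradient signal. This is an illustrative sketch assuming strictly positive distances, with hypothetical toy prototypes:

```python
import numpy as np

def harmonic_average_distance(x, prototypes):
    """Harmonic mean of the distances from x to all prototypes;
    every prototype contributes, which reduces sensitivity to
    the initial prototype placement. Assumes all distances > 0."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return len(d) / np.sum(1.0 / d)

W = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])  # toy prototypes
print(harmonic_average_distance(np.array([1.0, 0.0]), W))
```

The harmonic mean is dominated by the smallest distances, so it behaves like a smooth surrogate for the minimum distance used in standard GLVQ.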
“…Unfortunately, original GLVQ as proposed in [77] optimizes the cost function E_GLVQ(W, V) from (9) rather than the classification error. Hence, the performance cannot be judged consistently, neither in terms of the statistical quantities provided by the confusion matrix nor by ROC analysis.…”
Section: Generative Versus Discriminative Models, Asymmetric Error As…
confidence: 99%
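The GLVQ cost term referred to as (9) in the citing paper is built from the distances d^+ to the closest prototype of the correct class and d^− to the closest prototype of any other class. A minimal sketch of the standard relative-distance term, assuming it matches the usual GLVQ formulation (the full cost typically sums a monotonic squashing function of this term over all training samples):

```python
def glvq_cost_term(d_plus, d_minus):
    """GLVQ relative distance term mu = (d+ - d-)/(d+ + d-).
    mu < 0 means the correct-class prototype is closer, i.e. the
    sample is classified correctly; mu lies in (-1, 1)."""
    return (d_plus - d_minus) / (d_plus + d_minus)

print(glvq_cost_term(1.0, 3.0))  # → -0.5
```

Because mu is only a soft surrogate for the 0/1 classification error, a small cost value does not translate directly into confusion-matrix or ROC statistics, which is exactly the inconsistency the quoted statement points out.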