2011
DOI: 10.1016/j.neucom.2010.10.016

Divergence-based classification in learning vector quantization

Abstract: We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consist of non-negative, potentially normalized features. This is, for instance, the case for spectral data or histograms. In particular, we introduce and study Divergence-Based Learning Vector Quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family of γ-divergences, which includes the well-known Kullback-Leibler divergence and the so-called Cauchy-Schwarz divergenc…
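The abstract describes using divergences as the dissimilarity measure in nearest-prototype classification. A minimal illustrative sketch of that idea is shown below; it is not the authors' DLVQ implementation, and the function and parameter names (`cauchy_schwarz_divergence`, `kl_divergence`, `classify`, `eps`) are assumptions chosen for this example:

```python
import numpy as np

def cauchy_schwarz_divergence(p, q, eps=1e-12):
    """Cauchy-Schwarz divergence between non-negative vectors:
    D_CS(p, q) = -log( <p, q> / (||p|| ||q||) ).
    It is zero iff p and q are collinear."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    num = float(p @ q)
    den = float(np.linalg.norm(p) * np.linalg.norm(q))
    return -np.log(num / (den + eps) + eps)

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence for normalized histograms.
    A small eps avoids log(0) for empty bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))

def classify(x, prototypes, labels, divergence=cauchy_schwarz_divergence):
    """Assign x the label of the prototype with the smallest divergence,
    i.e. winner-takes-all classification with a divergence as dissimilarity."""
    dissimilarities = [divergence(x, w) for w in prototypes]
    return labels[int(np.argmin(dissimilarities))]
```

For example, with two histogram prototypes `[0.9, 0.1]` (label "a") and `[0.1, 0.9]` (label "b"), the input `[0.8, 0.2]` is assigned to "a" because its Cauchy-Schwarz divergence to the first prototype is smaller.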

Cited by 49 publications (33 citation statements)
References 16 publications
“…Obviously, the Euclidean distance is not based on a functional norm [2,3,23]. Yet, the transfer to real functional norms and distances like Sobolev norms [24,25], the Lee-norm [23,1], kernel based LVQ-approaches [26] or divergence based similarity measures [27,28], which carry the functional aspect inherently, is straightforward and topic of future investigations.…”
Section: Results
confidence: 99%
“…Adequate processing requires a precise interpretation of the differentiability, the most convenient in context of GLVQ seems to be the so-called Wirtinger calculus [44]. Divergences for density and histogram data are considered in [45,46], metrics for functional data were reported in [47,48,49]. Correlation based GLVQ as preferred frequently in biological application was introduced in [50] using the differentiability of the Pearson correlation.…”
Section: Beyond the Euclidean World - GLVQ With Non-Standard Dissimilarities
confidence: 99%
“…The incorporation of symmetric and non-symmetric, differentiable divergences into GLVQ training and classification is introduced in [37]. As an application example, the detection of Mosaic Disease in Cassava plants based on various image histograms is discussed there.…”
Section: Divergences
confidence: 99%