2009
DOI: 10.1109/tpami.2008.290

Kernel Discriminant Analysis for Positive Definite and Indefinite Kernels

Abstract: Kernel methods are a class of well-established and successful algorithms for pattern analysis, thanks to their mathematical elegance and good performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed so far based on the so-called kernel trick. The objective of this paper is twofold. First, we derive an additional kernel tool that is still missing, namely kernel quadratic discriminant (KQD). We discuss different formulations of KQD based on the regularized kernel Mahalanobis…

Cited by 99 publications (69 citation statements)
References 23 publications
“…One approach to describe the data is to estimate a normal distribution in feature space H induced via a mapping Φ : X → H. It has been shown by Pękalska et al. [13] that computing the variance term in GP regression is equal to the Mahalanobis distance (to the data mean in feature space) if the regularized (σ_n > 0) kernel-induced scaling matrix Σ = Φ(X)Φ(X)^T + σ_n^2 I is used:…”
Section: Predictive Variance Models a Gaussian in Feature Space
confidence: 99%
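A minimal numerical sketch of the identity this statement alludes to (not the cited authors' code; the tanh feature map, the data, and σ_n are illustrative assumptions, and centering at the data mean is omitted for brevity): with the regularized scaling matrix Σ = Φ(X)Φ(X)^T + σ_n^2 I, the quadratic form φ(x)^T Σ^{-1} φ(x) can be evaluated purely from kernel values and coincides with the GP predictive-variance term up to the factor σ_n^2.

```python
# Sketch: regularized kernel Mahalanobis-type quadratic form vs. GP predictive variance.
# Assumptions: explicit tanh feature map, random data, sigma_n = 0.3; no mean centering.
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_feat = 40, 3, 25
X = rng.normal(size=(n, d_in))            # training inputs
x_star = rng.normal(size=(d_in,))         # test input
sigma_n = 0.3                             # regularization / noise level (sigma_n > 0)

W = rng.normal(size=(d_feat, d_in))

def phi(x):
    """Explicit (assumed) feature map Phi : X -> H."""
    return np.tanh(W @ x)

Phi = np.stack([phi(x) for x in X], axis=1)        # d_feat x n, columns phi(x_i)
phi_s = phi(x_star)

# Direct feature-space computation with Sigma = Phi(X) Phi(X)^T + sigma_n^2 * I
Sigma = Phi @ Phi.T + sigma_n**2 * np.eye(d_feat)
direct = phi_s @ np.linalg.solve(Sigma, phi_s)

# Kernel-only computation: K = Phi^T Phi, k_* = Phi^T phi(x_*), k_** = <phi(x_*), phi(x_*)>
K = Phi.T @ Phi
k_star = Phi.T @ phi_s
k_ss = phi_s @ phi_s
gp_var = k_ss - k_star @ np.linalg.solve(K + sigma_n**2 * np.eye(n), k_star)

# By the Woodbury identity the two agree: gp_var == sigma_n^2 * direct
print(direct, gp_var / sigma_n**2)
```

The point of the sketch is that the quadratic form never needs the explicit feature map: it is recoverable from the Gram matrix K and the kernel values k_* and k_**, which is what makes the construction usable with implicit, kernel-induced feature spaces.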
“…Another family of useful npd kernels are the compact support (cs) kernels [16]. Popular non-Euclidean (nonmetric) similarities/dissimilarities, such as Hausdorff distances [17] and the Kullback-Leibler divergence between probability distributions, can be used to define npd kernels [18,19]. Hence, there is both practical and theoretical need to properly handle all these measures and npd kernels in order to extract discriminant features using a KDA framework with npd kernels.…”
Section: Introduction
confidence: 99%
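To make the concern about non-positive-definite measures concrete, here is a small, assumed illustration (not taken from [16-19]): a similarity matrix built from symmetrized Kullback-Leibler divergences between random discrete distributions is symmetric but need not be positive semidefinite, which is exactly the situation in which standard PSD-kernel machinery no longer applies directly and npd-aware methods are needed.

```python
# Check whether a KL-divergence-based similarity matrix is positive semidefinite.
# Assumed setup: random discrete distributions, similarity S = exp(-D) with D the
# symmetrized KL divergence; a strictly negative eigenvalue signals an npd kernel.
import numpy as np

rng = np.random.default_rng(1)
n, dim = 30, 5
P = rng.dirichlet(np.ones(dim), size=n)   # n discrete probability distributions

def sym_kl(p, q, eps=1e-12):
    """Symmetrized Kullback-Leibler divergence between two discrete distributions."""
    p, q = p + eps, q + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

D = np.array([[sym_kl(P[i], P[j]) for j in range(n)] for i in range(n)])
S = np.exp(-D)                            # similarity matrix used as a candidate kernel

eigvals = np.linalg.eigvalsh((S + S.T) / 2)   # symmetrize numerically, then eigendecompose
print("smallest eigenvalue:", eigvals.min())  # < 0 means the matrix is indefinite (npd)
```

Whether negative eigenvalues actually appear depends on the data and the scaling of D, but nothing guarantees positive semidefiniteness here, which is the motivation for discriminant-analysis variants that handle indefinite kernels.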
“…In particular, a geometrical interpretation of learning a large-margin classifier with indefinite kernels is discussed in [21]. Classification frameworks based on two-class Kernel Fisher Discriminant Analysis (KFDA) with indefinite kernels were proposed in [21], and Kernel Quadratic Discriminant (KQD) analysis with indefinite kernels in [18]. In this paper we study feature extraction with npd (or simply indefinite) kernels.…”
Section: Introduction
confidence: 99%