2013
DOI: 10.1007/978-3-642-41822-8_16

On the Generalization of the Mahalanobis Distance

Abstract: The Mahalanobis distance (MD) is a widely used measure in Statistics and Pattern Recognition. Interestingly, assuming that the data are generated from a Gaussian distribution, it considers the covariance matrix to evaluate the distance between a data point and the distribution mean. In this work, we generalize MD for distributions in the exponential family, providing both a definition in terms of the data density function and a computable version. We show its performance on several artificial and real data sc…
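As context for the abstract, the sketch below shows the classical Mahalanobis distance that the paper generalizes, computed from a sample mean and covariance. The function and variable names are illustrative only and are not taken from the paper.

```python
import numpy as np

def mahalanobis_distance(x, data):
    """Classical Mahalanobis distance of point x to the sample in `data`.

    data : (n_samples, n_features) array used to estimate mean and covariance.
    """
    mu = data.mean(axis=0)                 # estimate of the distribution mean
    cov = np.cov(data, rowvar=False)       # sample covariance matrix
    cov_inv = np.linalg.inv(cov)           # assumes the covariance is non-singular
    diff = x - mu
    return float(np.sqrt(diff @ cov_inv @ diff))

# Example: distance of a test point to a 2-D Gaussian sample
rng = np.random.default_rng(0)
sample = rng.multivariate_normal([0.0, 0.0], [[2.0, 0.5], [0.5, 1.0]], size=500)
print(mahalanobis_distance(np.array([1.0, 2.0]), sample))
```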

Cited by 12 publications (6 citation statements)
References 10 publications
“…Outliers were defined to facilitate the development of models that would (statistically) reliably describe the direction and dynamics of LUC within each set. It was achieved with an outlier criterion based on the Mahalanobis distance, which is a measure of dissimilarity of multi-attribute objects and a metric commonly used in cluster analysis [70,71]. The literature offers many modifications of the Mahalanobis distance [72] that can be used to identify outliers [73].…”
Section: Methods
confidence: 99%
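As an illustration of the outlier criterion described in the excerpt above, one common rule flags observations whose squared Mahalanobis distance exceeds a chi-square quantile. This is a generic sketch, not necessarily the cited authors' exact procedure, and the threshold choice (alpha) is an assumption.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_outliers(X, alpha=0.975):
    """Flag rows of X whose squared Mahalanobis distance exceeds the
    chi-square quantile with d degrees of freedom (d = n_features)."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # squared distance per row
    cutoff = chi2.ppf(alpha, df=X.shape[1])
    return d2 > cutoff                                   # boolean outlier mask
```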
“…The phonetic distance from the Korean/Austrian VOTs to the American English target VOT spaces was assessed by calculating the Mahalanobis distance (Kartushina, Hervais-Adelman, Frauenfelder, & Golestani, 2015), which computes the distance of a test point from the distribution mean by considering the covariance matrix (Martos, Muñoz, & González, 2013). The Mahalanobis distance takes into account natural variability in speech production by calculating the number of standard deviations from a learner's VOT to the mean of the target spaces (computed per plosive type) derived from the American English speakers, along each principal component axis of the target spaces (Kartushina et al, 2015).…”
Section: Variables: VOT Distance
confidence: 99%
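A hedged sketch of how such a per-category distance could be computed follows; the plosive labels, data layout, and dimensionality are invented for illustration and do not reproduce the cited study's code.

```python
import numpy as np

def per_category_mahalanobis(test_point, target_samples):
    """Mahalanobis distance from one learner measurement to each target
    category's distribution (e.g. native-speaker VOT spaces per plosive)."""
    distances = {}
    for category, samples in target_samples.items():
        mu = samples.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
        diff = test_point - mu
        distances[category] = float(np.sqrt(diff @ cov_inv @ diff))
    return distances

# Illustrative targets: native measurements grouped by plosive type
rng = np.random.default_rng(1)
targets = {p: rng.normal(loc=m, scale=5.0, size=(50, 2))
           for p, m in [("p", 60.0), ("t", 70.0), ("k", 80.0)]}
print(per_category_mahalanobis(np.array([65.0, 72.0]), targets))
```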
“…In this paper we generalize the work in [9] by introducing a family of kernels based on the underlying density function of the sample at hand that gives rise to new distances that generalize the MD. The distances proposed in this article preserve the essential property of the Mahalanobis distance: "all the points that belong to the same probability curve, that is $L_c(f_P) = \{x \mid f_P(x) = c\}$, where $f_P$ is the density function of the r.v.…”
Section: S21
confidence: 99%
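To make the level-set property quoted above concrete, here is a short standard derivation (not taken from the paper) for the Gaussian case: the density depends on the point only through the squared Mahalanobis distance, so density level curves coincide with Mahalanobis level curves.

```latex
% For X ~ N(mu, Sigma), the density is a monotone function of MD^2(x),
% so the level set L_c(f_P) is a Mahalanobis-distance level set.
\[
  f_P(x) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}
           \exp\!\Big(-\tfrac{1}{2}\,\mathrm{MD}^2(x)\Big),
  \qquad
  \mathrm{MD}^2(x) = (x-\mu)^{\top}\Sigma^{-1}(x-\mu),
\]
\[
  L_c(f_P) = \{\,x \mid f_P(x) = c\,\}
           = \{\,x \mid \mathrm{MD}^2(x) = -2\log\!\big(c\,(2\pi)^{d/2}|\Sigma|^{1/2}\big)\,\}.
\]
```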