2015
DOI: 10.1090/proc/12953

Positive definite matrices and the S-divergence

Abstract. Positive definite matrices abound in a dazzling variety of applications. This ubiquity can be in part attributed to their rich geometric structure: positive definite matrices form a self-dual convex cone whose strict interior is a Riemannian manifold. The manifold view is endowed with a "natural" distance function while the conic view is not. Nevertheless, drawing motivation from the conic view, we introduce the S-Divergence as a "natural" distance-like function on the open cone of positive definite matrices…
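For reference, the S-divergence discussed in the abstract is the log-determinant expression below, whose square root δ_S the paper shows to be a metric on the cone of positive definite matrices (notation ours):

\[
S(X, Y) \;=\; \log\det\!\left(\frac{X + Y}{2}\right) \;-\; \frac{1}{2}\,\log\det(XY),
\qquad
\delta_S(X, Y) \;=\; \sqrt{S(X, Y)}.
\]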

Cited by 109 publications (140 citation statements) · References 35 publications · Citing publications: 2015–2023

Citation statements (ordered by relevance):
“…(As a matter of curiosity we mention that in [3] it was conjectured that δ_S is not a metric, shortly afterwards in [2] the opposite was claimed, and finally Sra has shown that δ_S is indeed a true metric on P_n.) In [15] he has pointed out the importance of this new distance function. Among other things, he has emphasized that δ_S is a useful substitute for the widely applied geodesic distance δ_R: it respects a non-Euclidean geometry of a rather similar kind, but, compared to δ_R, the calculation of δ_S is easier and much less demanding in time and memory, which is a considerable advantage from the computational point of view.…”
Section: Introduction and Statement of the Results (mentioning)
confidence: 99%
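The computational advantage mentioned in this statement can be illustrated with a small numerical sketch (a hypothetical example; the function names and random test matrices are ours, not from the quoted papers). δ_R requires the generalized eigenvalues of the pair (B, A), whereas δ_S needs only Cholesky factorizations and log-determinants:

```python
import numpy as np
from scipy.linalg import cholesky, eigh

def random_spd(n, rng):
    # Generate a random symmetric positive definite matrix.
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

def logdet(A):
    # log det(A) via a Cholesky factorization (A assumed SPD).
    L = cholesky(A, lower=True)
    return 2.0 * np.sum(np.log(np.diag(L)))

def delta_R(A, B):
    # Affine-invariant Riemannian (geodesic) distance:
    # ||log(A^{-1/2} B A^{-1/2})||_F, computed from the
    # generalized eigenvalues of B v = lambda A v.
    lam = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

def delta_S(A, B):
    # Square root of the S-divergence: only log-determinants needed.
    s = logdet((A + B) / 2.0) - 0.5 * (logdet(A) + logdet(B))
    return np.sqrt(s)

rng = np.random.default_rng(0)
A, B = random_spd(200, rng), random_spd(200, rng)
print("delta_R =", delta_R(A, B))
print("delta_S =", delta_S(A, B))
```

On matrices of even moderate size, the Cholesky-based evaluation of δ_S is typically noticeably cheaper than the eigenvalue computation needed for δ_R, which is the point the quoted statement emphasizes.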
“…We will compare against six other methods, namely (i) log-Euclidean sparse coding (LE-SC) [18], which projects the data into the log-Euclidean symmetric space and then sparse-codes the matrices as Euclidean objects; (ii) Frob-SC, in which the manifold structure is discarded; (iii) Stein-Kernel-SC [20], which uses a kernel defined by the symmetric Stein divergence [21]; (iv) Log-Euclidean Kernel-SC, which is similar to (iii) but uses the log-Euclidean kernel [23]; (v) tensor sparse coding (TSC) [15], which uses the log-determinant divergence; and (vi) generalized dictionary learning (GDL) [16].…”
Section: Comparison Methods (mentioning)
confidence: 99%
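As a rough illustration of the log-Euclidean approach in item (i), SPD matrices can be mapped to a flat vector space via the matrix logarithm and then handled by ordinary Euclidean sparse coding. The sketch below is our own minimal version of that embedding; the sqrt(2) weighting of off-diagonal entries is the usual choice that makes the vectorization an isometry for the Frobenius norm:

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_vec(X):
    # Map an SPD matrix to the log-Euclidean space and flatten it.
    # Off-diagonal entries are scaled by sqrt(2) so that Euclidean
    # distances between these vectors equal Frobenius distances
    # between the matrix logarithms.
    L = logm(X).real
    n = L.shape[0]
    iu = np.triu_indices(n, k=1)
    return np.concatenate([np.diag(L), np.sqrt(2.0) * L[iu]])

# The resulting vectors can then be fed to any standard Euclidean
# sparse-coding or dictionary-learning routine.
```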
“…In [20], a kernelized sparse coding scheme is presented for SPD matrices using the Stein divergence [21] for generating the underlying kernel function. But this divergence does not induce a kernel for all bandwidths.…”
Section: Related Work (mentioning)
confidence: 99%
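For context, the bandwidth restriction alluded to here is usually stated as follows (recalled from Sra's results on the S-divergence; treat the exact index set as our paraphrase to be checked against [21]):

\[
k_\beta(X, Y) \;=\; e^{-\beta\, S(X, Y)}
\quad\text{is a positive definite kernel on } n \times n \text{ SPD matrices iff}\quad
\beta \in \left\{\tfrac{1}{2},\, 1,\, \tfrac{3}{2}, \dots, \tfrac{n-1}{2}\right\} \cup \left[\tfrac{n-1}{2}, \infty\right).
\]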
“…We can also use the recently introduced Stein divergence [Sra12] to determine similarities between points on the SPD manifold. Its symmetrised form is:…”
Section: Stein Divergence (mentioning)
confidence: 99%
“…Very recently, Sra et al. introduced the Stein kernel using Bregman matrix divergence as follows [Sra12]:…”
Section: Introduction (mentioning)
confidence: 99%
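Putting the pieces together, a Stein-kernel Gram matrix over a set of SPD matrices can be sketched as follows (a hypothetical illustration with our own function names, not the authors' code; the bandwidth beta must respect the positivity condition noted above):

```python
import numpy as np
from scipy.linalg import cholesky

def logdet(A):
    # log det of an SPD matrix via Cholesky.
    return 2.0 * np.sum(np.log(np.diag(cholesky(A, lower=True))))

def stein_divergence(X, Y):
    # S(X, Y) = log det((X + Y)/2) - (1/2) log det(XY).
    return logdet((X + Y) / 2.0) - 0.5 * (logdet(X) + logdet(Y))

def stein_kernel(mats, beta=0.5):
    # Gram matrix K[i, j] = exp(-beta * S(X_i, X_j)).
    m = len(mats)
    K = np.empty((m, m))
    for i in range(m):
        for j in range(i, m):
            K[i, j] = K[j, i] = np.exp(-beta * stein_divergence(mats[i], mats[j]))
    return K
```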