2019
DOI: 10.1007/978-3-030-33226-6_21
Mixture Probabilistic Principal Geodesic Analysis

Abstract: Dimensionality reduction on Riemannian manifolds is challenging due to the complex nonlinear data structures. While probabilistic principal geodesic analysis (PPGA) has been proposed to generalize conventional principal component analysis (PCA) onto manifolds, its effectiveness is limited to data with a single modality. In this paper, we present a novel Gaussian latent variable model that provides a unique way to integrate multiple PGA models into a maximum-likelihood framework. This leads to a well-defined mi…
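The abstract is cut off, but the mixture construction it names has a standard shape. As a hedged sketch in our own notation (K components with weights π_k, none of it taken from the paper), the marginal likelihood of a mixture of PGA models would read:

```latex
% Hedged sketch, notation ours: a mixture of K Riemannian normal
% components \mathcal{N}_M with weights \pi_k, means \mu_k, precisions \tau_k.
p(y \mid \theta) \;=\; \sum_{k=1}^{K} \pi_k \,
  \mathcal{N}_M\!\left(y \mid \mu_k, \tau_k\right)
```

which a maximum-likelihood procedure would fit over θ = {π_k, μ_k, τ_k}, e.g. by expectation-maximization.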

Cited by 29 publications (43 citation statements) · References 22 publications (28 reference statements)
“…In this work, we define a Gaussian distribution on a homogeneous space, which is a more general topological space than the symmetric space. A few others in the literature have generalized the Gaussian distribution to all Riemannian manifolds; for instance, in [52], the authors defined the Gaussian distribution on a Riemannian manifold without a proof that the normalizing factor is constant. In [38], by contrast, the author defined the normal law on Riemannian manifolds using the concept of entropy maximization for distributions with known mean and covariance.…”
Section: Key Contributions
confidence: 99%
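For readers without [52] or [38] at hand, the object under discussion is, in a hedged sketch (notation ours):

```latex
% Riemannian normal density and its normalizing factor (notation ours).
\mathcal{N}_M(y \mid \mu, \tau)
  = \frac{1}{C(\mu, \tau)} \exp\!\left(-\tfrac{\tau}{2}\, d(\mu, y)^2\right),
\qquad
C(\mu, \tau) = \int_M \exp\!\left(-\tfrac{\tau}{2}\, d(\mu, y)^2\right) \mathrm{dVol}(y)
```

The point at issue is whether C depends on μ: on a homogeneous space, an isometry carries any μ to any other point while preserving d and the volume element, so C is a function of τ alone; on a general Riemannian manifold that argument is unavailable, which is the gap the quoted passage attributes to [52].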
“…Turning to the manifold situation, the probabilistic view implies that the fundamental problem in generalizing PPCA is not to define low-dimensional subspaces, as sought by the approaches described in section 2.3, but instead to define a natural generalization of the Euclidean normal distribution to manifolds. PPCA has previously been generalized to manifolds with the probabilistic principal geodesic analysis (PPGA, [27]) procedure. For PPGA, the probability model for the data conditioned on the latent variables is a Riemannian normal distribution defined via its density…”
Section: 1
confidence: 99%
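As a concrete illustration of that generative model, here is a minimal sketch on the unit sphere; exp_map, mu, and W are our illustrative choices, not the paper's code, and the Riemannian normal noise step is omitted:

```python
import numpy as np

# Minimal sketch of the PPGA generative step on the unit sphere S^2.
# exp_map, mu, and W are illustrative, not the paper's code; the
# Riemannian normal noise around the generated point is omitted.

def exp_map(mu, v):
    """Riemannian exponential map on the unit sphere: start at mu, walk along tangent v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return mu
    return np.cos(norm_v) * mu + np.sin(norm_v) * (v / norm_v)

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0, 1.0])       # base point (north pole)
W = np.array([[0.8, 0.0],
              [0.0, 0.3],
              [0.0, 0.0]])           # columns lie in the tangent plane at mu
x = rng.standard_normal(2)           # latent variable x ~ N(0, I)
y = exp_map(mu, W @ x)               # point on S^2; PPGA adds manifold noise here
```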
“…The model is based on the Euclidean probabilistic principal component analysis procedure (PPCA, [26]), which interprets PCA as a latent variable model (1.1) with W having low rank k ≤ d. We use the PPCA approach with a probability model based on a notion of infinitesimal covariance and thereby avoid linearizing the nonlinear data space while intrinsically incorporating the effect of data anisotropy, here the differences between the principal eigenvalues. The model is related to the probabilistic principal geodesic analysis (PPGA, [27]) procedure, but uses the probability model and normal distributions defined in [19,24]. This construction in particular emphasizes the role of the connection on the manifold in linking infinitesimally close tangent spaces.…”
Section: Introduction
confidence: 99%
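The Euclidean PPCA model (1.1) referenced here, y = μ + Wx + ε with x ~ N(0, I_k) and ε ~ N(0, σ²I_d), admits the well-known closed-form maximum-likelihood solution of Tipping and Bishop; a minimal sketch (variable names ours):

```python
import numpy as np

# Minimal sketch of Euclidean PPCA (Tipping & Bishop): y = mu + W x + eps,
# x ~ N(0, I_k), eps ~ N(0, sigma^2 I_d). Closed-form ML fit; names ours.

def ppca_ml(Y, k):
    """Return ML estimates (mu, W, sigma^2) from data Y of shape (n, d), k < d."""
    mu = Y.mean(axis=0)
    S = np.cov(Y, rowvar=False)                 # sample covariance (d x d)
    evals, evecs = np.linalg.eigh(S)            # ascending eigenvalue order
    evals, evecs = evals[::-1], evecs[:, ::-1]  # reorder to descending
    sigma2 = evals[k:].mean()                   # variance lost in discarded directions
    W = evecs[:, :k] * np.sqrt(evals[:k] - sigma2)  # scale top-k eigenvectors
    return mu, W, sigma2

Y = np.random.default_rng(1).standard_normal((200, 5)) @ np.diag([3, 2, 1, 1, 1])
mu, W, sigma2 = ppca_ml(Y, k=2)
```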
“…where $f(y) = d(p_i, y)^2$. Using (31), $\nabla_{\pi_{H(v)}(p_i)} f$ can be computed as the magnitude of the component of $-2\operatorname{Log}_{\pi}$…”
Section: Linear Difference Indicators
confidence: 99%
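The fragment relies on the standard identity that the Riemannian gradient of the squared distance f(y) = d(p, y)² is −2 Log_y(p); a minimal sketch on the unit sphere (log_map is ours, not the paper's code):

```python
import numpy as np

# Minimal sketch of the identity grad_y d(p, y)^2 = -2 Log_y(p),
# illustrated on the unit sphere; log_map is ours, not the paper's code.

def log_map(y, p):
    """Riemannian log map on the unit sphere: tangent vector at y pointing to p."""
    cos_t = np.clip(np.dot(y, p), -1.0, 1.0)
    theta = np.arccos(cos_t)             # geodesic distance d(y, p)
    u = p - cos_t * y                    # component of p tangent to the sphere at y
    norm_u = np.linalg.norm(u)
    return theta * u / norm_u if norm_u > 1e-12 else np.zeros_like(y)

y = np.array([1.0, 0.0, 0.0])
p = np.array([0.0, 1.0, 0.0])
grad = -2.0 * log_map(y, p)              # Riemannian gradient of f(.) = d(p, .)^2 at y
print(np.linalg.norm(grad))              # equals 2 * d(p, y) = pi here
```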