2014
DOI: 10.1109/tpami.2014.2316836

Spherical and Hyperbolic Embeddings of Data

Abstract: Many computer vision and pattern recognition problems may be posed as the analysis of a set of dissimilarities between objects. For many types of data, these dissimilarities are not Euclidean (i.e., they do not represent the distances between points in a Euclidean space) and therefore cannot be isometrically embedded in a Euclidean space. Examples include shape dissimilarities, graph distances, and mesh geodesic distances. In this paper, we provide a means of embedding such non-Euclidean data onto surf…

Cited by 91 publications (66 citation statements)
References 31 publications

“…Let $P_{\mathcal{M}}(\theta)$ be a probability distribution with support on $\mathcal{M}$ and parametrised by a vector $\theta$. Given the tangent plane $T_x\mathcal{M} \cong \mathbb{R}^d$ at $x$, a general approach to computing $P_{\mathcal{M}}(\theta)$ is to take a probability distribution $P(\theta)$ with support on $T_x\mathcal{M}$ and compute $P_{\mathcal{M}}(\theta)$ as the push-forward of $P(\theta)$ through the Riemannian exponential map (exp-map) $\mathrm{Exp}_x(\cdot)$ [16,6]. Intuitively, a sample from $P_{\mathcal{M}}(\theta)$ is obtained by first sampling from $P(\theta)$ and then mapping the sample to $\mathcal{M}$ using the exp-map.…”
Section: Priors on CCMs (mentioning)
confidence: 99%
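
As a concrete illustration of the push-forward construction quoted above, the following is a minimal sketch for the unit sphere $S^2$: a sample is drawn from an isotropic Gaussian on the tangent plane at $x$ and mapped to the manifold with the exp-map. The function names and the Gaussian choice for $P(\theta)$ are illustrative assumptions, not code from the cited works.

```python
import numpy as np

def exp_map_sphere(x, v):
    # Riemannian exp-map on the unit sphere: maps a tangent vector v
    # at base point x (v orthogonal to x, |x| = 1) along the geodesic.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def sample_pushforward(x, scale, rng):
    # Push-forward sampling: draw from an isotropic Gaussian on the
    # tangent plane T_x S^{d-1}, then map the sample to the sphere.
    d = x.shape[0]
    g = rng.normal(scale=scale, size=d)
    v = g - np.dot(g, x) * x          # project onto the tangent plane at x
    return exp_map_sphere(x, v)

rng = np.random.default_rng(0)
x = np.array([0.0, 0.0, 1.0])         # base point on S^2
samples = np.array([sample_pushforward(x, 0.3, rng) for _ in range(5)])
print(np.linalg.norm(samples, axis=1))  # all ~1: samples lie on the sphere
```
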
“…In the more specific area of CCMs, [1,11] propose hyperspherical variational autoencoders to better model data on a hypersphere, whereas [5,12] exploit CCMs with different curvatures to perform change detection on sequences of graphs. Several other works, however, show how different types of data can greatly benefit from being embedded on CCMs, with literature going as far back as [13], and the more recent notable contributions of [2] and [6]. However, since the application of non-Euclidean geometry to deep representation learning is a fairly recent development in machine learning [7], most methods are specific to only one type of CCM (i.e., either hyperbolic or hyperspherical) and a unified model dealing with the general family of CCMs is still missing.…”
Section: Introduction (mentioning)
confidence: 99%
“…, $x_t \in \mathcal{M}_\kappa$. Following [11], we consider a symmetric distance measure $d(\cdot,\cdot)$ operating in $G$ and propose to adopt an embedding aimed at preserving the distances, i.e., $\rho_\kappa(f(g_1), f(g_2)) \approx d(g_1, g_2)$ for all $g_1, g_2 \in G$.…”
Section: Embedding Based on Dissimilarity Matrix (mentioning)
confidence: 99%
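
The quoted objective $\rho_\kappa(f(g_1), f(g_2)) \approx d(g_1, g_2)$ is a stress criterion over a constant-curvature manifold. Below is a minimal sketch, assuming the unit sphere ($\kappa = 1$, geodesic distance $\rho(x, y) = \arccos\langle x, y\rangle$) and plain projected gradient descent; `spherical_stress_embedding` is a hypothetical helper, not the algorithm of the cited paper or of [11].

```python
import numpy as np

def spherical_stress_embedding(D, d=3, lr=0.1, steps=500, seed=0):
    # Embed n objects on the unit sphere S^{d-1} so that geodesic
    # distances arccos(<x_i, x_j>) approximate the symmetric target
    # dissimilarity matrix D, by minimizing the raw stress
    # sum_ij (rho_ij - D_ij)^2 with projected gradient descent.
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    X = rng.normal(size=(n, d))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    off = ~np.eye(n, dtype=bool)
    for _ in range(steps):
        G = np.clip(X @ X.T, -1.0, 1.0)   # pairwise inner products
        rho = np.arccos(G)                # geodesic distances
        # Chain rule: d rho_ij / d G_ij = -1 / sqrt(1 - G_ij^2).
        W = np.zeros_like(G)
        W[off] = -(rho[off] - D[off]) / np.sqrt(1.0 - G[off] ** 2 + 1e-9)
        X -= lr * (2.0 * W @ X) / n       # Euclidean gradient step
        X /= np.linalg.norm(X, axis=1, keepdims=True)  # retract to sphere
    return X

# Toy check: recover three points from their pairwise geodesic distances.
D = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.5],
              [1.0, 1.5, 0.0]])
X = spherical_stress_embedding(D)
print(np.arccos(np.clip(X @ X.T, -1.0, 1.0)))  # approximately D
```
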
“…Many relevant machine learning applications require going beyond conventional Euclidean geometry, as in the case of data described by attributed graphs [1], [2]. When studying problems on graphs, a key issue is finding representations that can deal with their underlying geometry, which is usually defined by application-specific distances that often do not satisfy the identity of indiscernibles or the triangle inequality [3], [4]. The use of metric distances, such as graph alignment distances [5], only mitigates the problem, as these distances are computationally intractable and hence not useful in practical applications.…”
Section: Introduction (mentioning)
confidence: 99%
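
Since this excerpt hinges on dissimilarities that violate metric axioms, a small diagnostic makes the claim concrete. The helper below is an assumed illustration, not from the cited works; it counts violations of the identity of indiscernibles and of the triangle inequality in a dissimilarity matrix.

```python
import numpy as np

def metric_violations(D, tol=1e-9):
    # Count violations of two metric axioms in a symmetric
    # dissimilarity matrix D:
    #  - identity of indiscernibles: D_ij == 0 for some i != j
    #  - triangle inequality: D_ij > D_ik + D_kj for some i, k, j
    n = D.shape[0]
    off = ~np.eye(n, dtype=bool)
    indiscernible = int(np.sum(np.abs(D[off]) < tol))
    # Index [i, k, j]: compare D_ij against D_ik + D_kj for all triples.
    triangle = int(np.sum(D[:, None, :] > D[:, :, None] + D[None, :, :] + tol))
    return indiscernible, triangle
```

Nonzero counts signal that no Euclidean embedding of the data can be isometric, which is the situation motivating the spherical and hyperbolic embeddings of the paper under review.
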