2019 IEEE 58th Conference on Decision and Control (CDC)
DOI: 10.1109/cdc40024.2019.9029297
Low-rank approximations of hyperbolic embeddings

Abstract: The hyperbolic manifold is a smooth manifold of constant negative curvature. While the hyperbolic manifold is well studied in the literature, it has recently gained interest in the machine learning and natural language processing communities due to its usefulness in modeling continuous hierarchies. Tasks with hierarchical structure are ubiquitous in these fields, and there is general interest in learning hyperbolic representations, or embeddings, of such tasks. Additionally, these embeddings of related tasks may…

Cited by 8 publications (10 citation statements). References 15 publications.
“…Proof. Using (13), the usual first-order characterization of convex functions implies that, for all minimal geodesic segments γ : I → C, the composition f ∘ γ : I → R is convex if and only if
(⟨J Df(γ(t₂)), γ′(t₂)⟩ − ⟨J Df(γ(t₁)), γ′(t₁)⟩)(t₂ − t₁) ≥ 0, ∀ t₁, t₂ ∈ I.…”
Section: Hyperbolically Convex Quadratic Functions (mentioning)
confidence: 99%
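The first-order condition quoted above says that along any minimal geodesic, the directional derivative of f is nondecreasing. This can be checked numerically. The sketch below is an illustration, not code from the cited paper: it works in the hyperboloid model H² with J = diag(−1, 1, 1), and uses the squared geodesic distance to a fixed point q as the test function, a standard example of a geodesically convex function on hyperbolic space (all names and choices here are assumptions for illustration).

```python
import numpy as np

# Hyperboloid model H^2 = {x in R^3 : <x,x>_J = -1, x_0 > 0}, J = diag(-1,1,1).
J = np.diag([-1.0, 1.0, 1.0])

def lorentz(u, v):
    """Lorentzian inner product <u, v>_J = u^T J v."""
    return u @ J @ v

def geodesic(p, v, t):
    """Unit-speed geodesic gamma(t) = cosh(t) p + sinh(t) v,
    with p on H^2 and v a unit tangent vector (<p,v>_J = 0, <v,v>_J = 1)."""
    return np.cosh(t) * p + np.sinh(t) * v

# Test function (assumed example): squared geodesic distance to q,
# f(x) = arccosh(-<x, q>_J)^2, known to be geodesically convex.
q = np.array([np.cosh(0.7), np.sinh(0.7), 0.0])

def f(x):
    return np.arccosh(np.clip(-lorentz(x, q), 1.0, None)) ** 2

# Base point on H^2 and a unit tangent direction orthogonal to it.
p = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 0.0, 1.0])

ts = np.linspace(-1.0, 1.0, 201)
vals = np.array([f(geodesic(p, v, t)) for t in ts])
deriv = np.gradient(vals, ts)  # finite-difference derivative of f∘gamma

# First-order characterization: (f∘gamma)' must be nondecreasing.
print(np.all(np.diff(deriv) >= -1e-8))  # True
```

The monotonicity of `deriv` is exactly the displayed condition specialized to this geodesic, since (f ∘ γ)′(t) = ⟨J Df(γ(t)), γ′(t)⟩ in the Lorentzian inner product.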
“…For instance, several problems in machine learning, artificial intelligence, and financial networks, as well as Procrustes problems and many other practical questions, can be modeled in this setting. Papers dealing with these subjects include machine learning [20], artificial intelligence [19], neural circuits [24], low-rank approximations of hyperbolic embeddings [13,26], Procrustes problems [25], financial networks [14], complex networks [15,18], embeddings of data [30], and the references therein. We also mention that there are many related papers on strain analysis; see, for example, [28,31].…”
Section: Introduction (mentioning)
confidence: 99%
“…In Algorithm 3 we propose a simple but suboptimal procedure to solve this low-rank approximation problem. Unlike iterative refinement algorithms based on optimization on manifolds [25], our proposed method is one-shot. It is based on the spectral factorization of the estimated hyperbolic Gramian and involves the following steps:…”
Section: Low-rank Approximation Of H-Gramians (mentioning)
confidence: 99%
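The one-shot procedure described in this excerpt can be sketched with a spectral factorization. The code below is a hedged illustration of the general idea, not the cited paper's exact Algorithm 3: given an estimated hyperbolic Gramian G ≈ Xᵀ H X with H = diag(−1, I_d), it keeps the most negative eigenvalue for the time-like coordinate and the d largest positive eigenvalues for the space-like coordinates (the function name and clipping choices are assumptions).

```python
import numpy as np

def lowrank_hgram_embedding(G, d):
    """One-shot rank-(d+1) factorization: return X of shape (d+1, n)
    with X^T diag(-1, I_d) X approximating the symmetric matrix G."""
    G = (G + G.T) / 2                       # symmetrize the estimate
    w, U = np.linalg.eigh(G)                # eigenvalues in ascending order
    lam_neg = max(-w[0], 0.0)               # most negative eigenvalue
    pos = np.argsort(w)[::-1][:d]           # indices of d largest eigenvalues
    lam_pos = np.clip(w[pos], 0.0, None)    # discard spurious negatives
    x0 = np.sqrt(lam_neg) * U[:, 0]         # time-like row
    Xs = np.sqrt(lam_pos)[:, None] * U[:, pos].T  # space-like rows
    return np.vstack([x0, Xs])

# Usage: build an exact H-Gramian from random hyperboloid points, then recover.
rng = np.random.default_rng(0)
d, n = 2, 6
Y = rng.normal(size=(d, n))
x0 = np.sqrt(1 + (Y**2).sum(axis=0))        # lift onto the hyperboloid
X = np.vstack([x0, Y])
H = np.diag([-1.0] + [1.0] * d)
G = X.T @ H @ X
Xhat = lowrank_hgram_embedding(G, d)
print(np.allclose(Xhat.T @ H @ Xhat, G, atol=1e-8))  # True
```

Since G = Xᵀ H X has signature (d positive, 1 negative) on its rank-(d + 1) column space, the kept eigenpairs reconstruct G exactly in this noiseless example; with a noisy estimated Gramian the same steps give a one-shot low-rank approximation, matching the "spectral factorization" description in the excerpt.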
“…, i_K) ∈ Ω. We convert problem (2) into a minimax problem by constructing a partial dual similar to [15]. This leads to a factorization of the tensor W in a form with separate factors for the nonnegativity and low-rank constraints.…”
Section: Introduction (mentioning)
confidence: 99%