Findings of the Association for Computational Linguistics: EMNLP 2021
DOI: 10.18653/v1/2021.findings-emnlp.42

Hyperbolic Geometry is Not Necessary: Lightweight Euclidean-Based Models for Low-Dimensional Knowledge Graph Embeddings


Abstract: Recent knowledge graph embedding (KGE) models based on hyperbolic geometry have shown great potential in low-dimensional embedding spaces. However, the necessity of hyperbolic space in KGE is still questionable, because the calculations required by hyperbolic geometry are much more complicated than Euclidean operations. In this paper, based on the state-of-the-art hyperbolic-based model RotH, we develop two lightweight Euclidean-based models, called RotL and Rot2L. The RotL model simplifies the hyperbolic operation…

Cited by 11 publications (21 citation statements)
References 19 publications
“…Although circular rotation is theoretically able to infer symmetric patterns [36] (i.e., by setting the rotation angle 𝜃 = 𝜋 or 𝜃 = 0), circular reflection can represent symmetric relations more effectively since its second power is the identity. AttH [36] combines circular rotations and circular reflections by using an attention mechanism learned in the tangent space, which requires additional parameters. We also combine circular rotation and circular reflection operators, but in a different way.…”
Section: Relation Parameterization
confidence: 99%
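For context on the involution property mentioned in the statement above, the standard 2-D rotation and reflection matrices make the claim easy to verify; this is general background, not material taken from the cited paper:

\[
\mathrm{Rot}(\theta)=\begin{pmatrix}\cos\theta & -\sin\theta\\ \sin\theta & \cos\theta\end{pmatrix},\qquad
\mathrm{Ref}(\theta)=\begin{pmatrix}\cos\theta & \sin\theta\\ \sin\theta & -\cos\theta\end{pmatrix}
\]
\[
\mathrm{Ref}(\theta)^{2}=I \ \text{for every } \theta,\qquad
\mathrm{Rot}(\theta)^{2}=\mathrm{Rot}(2\theta)=I \ \text{only when } \theta\in\{0,\pi\}.
\]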
“…We further improve the Hardness-aware Activation by multiplying the similarity score with a relation-specific trainable parameter. Used in previous low-dimensional KGE models, this technique helps encode hierarchical relationships and improves prediction accuracy [28]. Finally, given a query (𝑒, 𝑟) and its target entity 𝑒𝑝, the triple score based on the 𝐿2 distance via the Hardness-aware Activation is defined as:…”
Section: Hardness-aware Activation Mechanism
confidence: 99%
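The quoted statement truncates the actual scoring formula, so the following is only a minimal Python sketch of an L2-distance triple score scaled by a relation-specific trainable parameter; the names (l2_score, rel_scale) are hypothetical, and the exact Hardness-aware Activation of the cited work is not reproduced here.

import numpy as np

def l2_score(query, target, rel_scale):
    """Hypothetical triple score: negative squared L2 distance between the
    transformed query vector and the target entity embedding, multiplied by a
    relation-specific trainable parameter (rel_scale).
    This is an illustrative assumption, not the cited paper's formula."""
    distance_sq = np.sum((query - target) ** 2)
    return -rel_scale * distance_sq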
“…To verify the performance of our HaLE framework, we select five representative KGE models: TransE [3], DistMult [34], RotatE [22], RotE [5], and RotL [28]. These models utilize five different transform functions to generate the query vector in the Euclidean space, which are formulated as follows:…”
Section: The HaLE Framework
confidence: 99%
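Because the quoted passage cuts off before the formulas, here is a rough Python sketch of the query transforms these models are commonly described with (translation for TransE, element-wise scaling for DistMult, 2-D rotations for RotatE, rotation plus translation for RotE). RotL's flexible variant is omitted since the source does not reproduce it, and the helper names below are illustrative assumptions rather than the paper's notation.

import numpy as np

def rotate_2d(x, theta):
    """Apply block-diagonal 2-D rotations to consecutive coordinate pairs of x.
    Illustrative helper: x has even dimension, theta holds dim(x) // 2 angles."""
    pairs = x.reshape(-1, 2)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    rotated = np.stack([cos_t * pairs[:, 0] - sin_t * pairs[:, 1],
                        sin_t * pairs[:, 0] + cos_t * pairs[:, 1]], axis=1)
    return rotated.reshape(-1)

# Commonly used query transforms (a sketch under the assumptions stated above):
def transe_query(h, r):        return h + r                     # translation
def distmult_query(h, r):      return h * r                     # element-wise scaling
def rotate_query(h, theta):    return rotate_2d(h, theta)       # rotation in 2-D (complex) planes
def rote_query(h, theta, r):   return rotate_2d(h, theta) + r   # rotation followed by translation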