2019
DOI: 10.48550/arxiv.1902.10197
Preprint

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

Zhiqing Sun,
Zhi-Hong Deng,
Jian-Yun Nie
et al.

Abstract: We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links. The success of such a task heavily relies on the ability of modeling and inferring the patterns of (or between) the relations. In this paper, we present a new approach for knowledge graph embedding called RotatE, which is able to model and infer various relation patterns including: symmetry/antisymmetry, inversion, and composition. Specifically, the RotatE model defines each relation as …
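The rotation idea the abstract describes can be sketched in a few lines. This is a toy illustration under stated assumptions, not the paper's implementation: RotatE represents each relation as an element-wise rotation in complex space (a vector of phases, so every relation coordinate has unit modulus) and scores a triple by the distance between the rotated head and the tail. The function name and dimensions below are invented for the sketch.

```python
import numpy as np

def rotate_score(head, relation_phase, tail):
    """Distance-style RotatE score sketch: rotate the complex head
    embedding by the relation's phase vector (unit-modulus complex
    numbers), then measure the L1 distance to the tail embedding.
    Lower distance means a more plausible triple."""
    rotation = np.exp(1j * relation_phase)   # |r_i| = 1: each entry is a pure rotation
    return np.linalg.norm(head * rotation - tail, ord=1)

# Toy example: a relation that rotates every dimension by pi/2.
h = np.array([1 + 0j, 0 + 1j])
phase = np.array([np.pi / 2, np.pi / 2])
t = h * np.exp(1j * phase)                   # tail constructed to fit the triple exactly
print(rotate_score(h, phase, t))             # ~0.0: a perfect fit
```

Because a rotation by phase 0 is the identity and a rotation composed with its negation cancels, this parameterization is what lets the model express symmetry/antisymmetry, inversion, and composition.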


Cited by 180 publications (386 citation statements)
References 15 publications
“…authorship relations or citation links). In order to address this issue, the authors present a reimplementation of TransE [4] and RotatE [25] by using a newly proposed loss function optimized for many-to-many relations, i.e. Soft Margin (SM) Loss.…”
Section: Knowledge Graph Embeddings and Scholarly Data (mentioning)
confidence: 99%
“…We employ self-adversarial negative sampling [5] method to generate corrupt samples. We define the probability distribution of negative samples by:…”
Section: Loss Function (mentioning)
confidence: 99%
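The self-adversarial negative sampling this statement cites (proposed in the RotatE paper) weights each corrupted triple by a softmax over the current model's scores, so harder negatives contribute more to the loss. A minimal sketch, assuming plain numpy and a temperature parameter `alpha`; the function name is invented here:

```python
import numpy as np

def self_adversarial_weights(neg_scores, alpha=1.0):
    """Softmax with temperature alpha over the plausibility scores of
    the corrupted (negative) triples. Higher-scoring (harder) negatives
    receive larger weight; the weights sum to 1."""
    logits = alpha * np.asarray(neg_scores, dtype=float)
    logits -= logits.max()                   # subtract max for numerical stability
    weights = np.exp(logits)
    return weights / weights.sum()

# Three negatives, scored by the current model; the first is the hardest.
w = self_adversarial_weights([2.0, 0.5, -1.0], alpha=1.0)
print(w)   # weights sum to 1; the hardest negative dominates
```

In training, these weights are typically treated as constants (no gradient flows through them), so the sampling scheme sharpens the loss without adding parameters.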
“…We compared our model with several state-of-the-art baselines. For transformation based models, we compared our model to TransE [4], TorusE [24], RotatE [5] and HAKE [7]; for bilinear models, we compared our model to ComplEx [13], HolE [12], SimplE [9], DihEdral [14] and QuatE [6] (to make the comparison fair, we use the version of QuatE without type constraints on the common link prediction datasets considering the requirement of type constraints is too strong).…”
Section: Baselines (mentioning)
confidence: 99%
“…The key idea is to transform entities and relations in triple into continuous vector space. ComplEx [14] introduces complex embeddings so as to better model asymmetric relations and RotatE [13] further infers the composition pattern.…”
Section: Related Work (mentioning)
confidence: 99%
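The composition pattern this statement attributes to RotatE follows directly from the rotation parameterization: composing two relations corresponds to adding their phase vectors (mod 2π). A small numeric check, with invented toy phases:

```python
import numpy as np

# If r3 is the composition of r1 and r2, RotatE models theta_3 = theta_1 + theta_2.
theta1 = np.array([0.3, 1.2])
theta2 = np.array([0.5, -0.4])
h = np.array([1 + 2j, 0.5 - 1j])             # a toy complex head embedding

step = h * np.exp(1j * theta1) * np.exp(1j * theta2)   # rotate by r1, then by r2
direct = h * np.exp(1j * (theta1 + theta2))            # rotate once by the composed r3
print(np.allclose(step, direct))             # rotations compose by phase addition
```

The same algebra gives the other patterns: theta = 0 or pi yields a symmetric relation, and negating the phase vector yields the inverse relation.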