Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022
DOI: 10.18653/v1/2022.acl-long.402
RotateQVS: Representing Temporal Information as Rotations in Quaternion Vector Space for Temporal Knowledge Graph Completion

Abstract: Temporal factors are tied to the growth of facts in realistic applications, such as the progression of diseases and the development of political situations; therefore, research on Temporal Knowledge Graphs (TKGs) attracts much attention. In TKGs, relation patterns inherent with temporality need to be studied for representation learning and reasoning across temporal facts. However, existing methods can hardly model temporal relation patterns, nor can they capture the intrinsic connections between relations when evo…

Cited by 23 publications (12 citation statements) · References 25 publications
“…Embeddings for temporal knowledge graphs must account for temporal facts, including their time components, and express the corresponding temporal patterns. Four kinds of logical patterns, namely symmetric, inverse, asymmetric, and evolve, are most commonly considered and studied in existing TKGE models (Chen et al. 2022; Xu et al. 2020). However, their definitions either neglect time information or only consider patterns in which facts happen at the same time.…”
Section: Theoretical Analysis on Temporal Patterns
confidence: 99%
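To make the four patterns concrete, here is one common formalization over temporal facts r(h, t, τ), adapted from standard TKGE conventions; the exact definitions in the cited papers may differ, notably in how they treat the time component.

```latex
% One common formalization of the four temporal relation patterns.
% Notation: a temporal fact r(h, t, \tau) holds between head h and tail t at time \tau.
% These are illustrative definitions, not quoted from the cited papers.
\begin{align*}
\text{(symmetric)}  &\quad r(h, t, \tau) \Leftrightarrow r(t, h, \tau) \\
\text{(asymmetric)} &\quad r(h, t, \tau) \Rightarrow \neg\, r(t, h, \tau) \\
\text{(inverse)}    &\quad r_1(h, t, \tau) \Leftrightarrow r_2(t, h, \tau) \\
\text{(evolve)}     &\quad r_1(h, t, \tau_1) \Rightarrow r_2(h, t, \tau_2), \quad \tau_1 < \tau_2
\end{align*}
```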
“…Existing embedding approaches such as TeRO, RotateQVS, and TLT-KGE (Xu et al. 2020; Chen et al. 2022; Zhang et al. 2022) resorted to a single underlying embedding space, such as complex or quaternion space, to model symmetric patterns via rotations on a unit hypersphere. Other works (Chami et al. 2020; Balazevic, Allen, and Hospedales 2019; Montella, Barahona, and Heinecke 2021; Han et al. 2020) use hyperbolic space to preserve hierarchical patterns in temporal KGs.…”
Section: Introduction
confidence: 99%
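As a rough illustration of rotation-based modeling in quaternion space, the following minimal NumPy sketch rotates an entity quaternion by a unit "time" quaternion. All names are hypothetical and this is not the actual RotateQVS or TLT-KGE implementation, only the general rotation mechanism they build on.

```python
import numpy as np

def hamilton_product(q, p):
    """Hamilton product of two quaternions (a, b, c, d) = a + bi + cj + dk."""
    a1, b1, c1, d1 = q
    a2, b2, c2, d2 = p
    return np.array([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,
        a1*b2 + b1*a2 + c1*d2 - d1*c2,
        a1*c2 - b1*d2 + c1*a2 + d1*b2,
        a1*d2 + b1*c2 - c1*b2 + d1*a2,
    ])

def unit_time_quaternion(theta, axis):
    """Unit quaternion encoding a timestamp as a rotation by angle theta
    around a normalized imaginary axis -- one way to realize the
    'temporal information as rotation' idea."""
    axis = axis / np.linalg.norm(axis)
    return np.concatenate([[np.cos(theta / 2)], np.sin(theta / 2) * axis])

def rotate_entity(entity_q, time_q):
    """Rotate an entity quaternion by a unit time quaternion: q' = t q t*.
    For a unit quaternion, the inverse equals its conjugate."""
    conj = time_q * np.array([1.0, -1.0, -1.0, -1.0])
    return hamilton_product(hamilton_product(time_q, entity_q), conj)

# Toy usage: a purely imaginary entity embedding rotated by one time step.
h = np.array([0.0, 1.0, 0.0, 0.0])                       # entity as quaternion
t_q = unit_time_quaternion(np.pi / 4, np.array([0.0, 0.0, 1.0]))
print(rotate_entity(h, t_q))  # entity representation at the given timestamp
```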
“…where 1 ≤ i ≤ l. For ease of computation, we map Q_r(·) ∈ 𝔹^{n, c_r} to the tangent space at the origin of the Poincaré ball model, shared by all c_r for r ∈ R, according to the exponential and logarithmic maps in Equation (3). Given that co-occurring events are consistent and structurally dependent, we leverage mean-pooling to aggregate information from events involving o.…”
Section: Embedding via Event Interactions
confidence: 99%
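The exponential and logarithmic maps mentioned above have standard closed forms at the origin of a Poincaré ball. The sketch below, with hypothetical names and a curvature parameter c assumed positive, shows tangent-space mean-pooling in this style; it is not the cited paper's code.

```python
import numpy as np

def expmap0(v, c):
    """Exponential map at the origin of a Poincare ball with curvature -c:
    tangent vector v -> point on the ball."""
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.zeros_like(v)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(y, c):
    """Logarithmic map at the origin: point y on the ball -> tangent vector."""
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(y)
    if norm < 1e-12:
        return np.zeros_like(y)
    return np.arctanh(sqrt_c * norm) * y / (sqrt_c * norm)

def tangent_mean_pool(points, c):
    """Aggregate hyperbolic points by mean-pooling in the shared tangent
    space at the origin, then mapping the mean back onto the ball."""
    tangents = [logmap0(p, c) for p in points]
    return expmap0(np.mean(tangents, axis=0), c)

# Toy usage: pool two event embeddings with curvature parameter c = 1.0.
events = [np.array([0.1, 0.2]), np.array([0.3, -0.1])]
print(tangent_mean_pool(events, c=1.0))
```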
“…To represent knowledge that is time-dependent and evolving, previous methods [1-3] attempted to encode quadruplets with time-specific embeddings of entities, relations, and timestamps. Extending the scope of knowledge representation, these methods capture rich semantic patterns, including symmetry, asymmetry, inversion, and time evolution, to some extent.…”
Section: Introduction
confidence: 99%
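As a minimal, hedged illustration of quadruplet encoding, the sketch below scores a fact (h, r, t, τ) in the spirit of a simple translation-based model (similar to TTransE); the cited methods [1-3] use more sophisticated scoring functions, and the lookup tables and names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Hypothetical lookup tables for entities, relations, and timestamps.
entity_emb = {"Obama": rng.normal(size=dim), "USA": rng.normal(size=dim)}
relation_emb = {"president_of": rng.normal(size=dim)}
time_emb = {"2012": rng.normal(size=dim)}

def score_quadruple(h, r, t, tau):
    """Translation-style plausibility score for a quadruplet (h, r, t, tau):
    the translation h + r + tau should land near t, so a smaller distance
    (larger score) means a more plausible temporal fact."""
    return -np.linalg.norm(
        entity_emb[h] + relation_emb[r] + time_emb[tau] - entity_emb[t]
    )

print(score_quadruple("Obama", "president_of", "USA", "2012"))
```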
“…Compared with embedding-based KGE approaches, text-based methods incorporate available texts for KGE. With the development of Pre-trained Language Models (PLMs), many text-based models (Saxena et al., 2022; Kim et al., 2020; Markowitz et al., 2022; Chen et al., 2022a) have been proposed; they obtain promising performance and benefit from a fixed memory footprint for large-scale real-world KGs. Recently, large language models (LLMs), e.g., GPT-3 (Brown et al., 2020) and ChatGPT (OpenAI, 2022), further demonstrated the ability to perform a variety of natural language processing (NLP) tasks without adaptation, offering potential opportunities for better knowledge representations.…”
Section: Introduction
confidence: 99%
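A hedged sketch of the text-based idea: encode the textual descriptions of a triple with a PLM and score plausibility by embedding similarity. The model choice, pooling, and scoring below are illustrative assumptions, not the approach of any specific cited paper; the fixed memory footprint comes from not storing a per-entity embedding table.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative encoder choice; any PLM checkpoint could be substituted.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into a single sentence vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

def score_triple(head_text: str, relation_text: str, tail_text: str) -> float:
    """Cosine similarity between the 'head + relation' text and the tail
    text; a higher value means the triple is judged more plausible."""
    query = embed(f"{head_text} {relation_text}")
    target = embed(tail_text)
    return torch.nn.functional.cosine_similarity(query, target, dim=0).item()

print(score_triple("Barack Obama", "president of", "United States"))
```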