RETRA: Recurrent Transformers for Learning Temporally Contextualized Knowledge Graph Embeddings
2021
DOI: 10.1007/978-3-030-77385-4_25

Cited by 7 publications (3 citation statements)
References 14 publications
“…The notion of modeling relations as operations in some vector space was further extended by Wang et al. [19]. Over time, even more complex neural architectures were employed, e.g., capsule networks in CapsE by Nguyen et al. [63], recurrent skipping networks by Guo et al. [33], graph convolutional networks in GCN-Align by Wang et al. [93] or R-GCN by Schlichtkrull et al. [82], and recurrent transformers by Werner et al. [95]. Yet another approach was taken in RESCAL by Nickel et al., where tensor-based techniques were used [65]. Similar approaches were taken in TOAST by Jachnik et al. [39], TATEC by García-Durán et al. [28], DistMult by Yang et al. [97], HolE by Nickel et al. [64], ComplEx by Trouillon et al. [88], or ANALOGY by Liu et al. [52].…”
Section: Knowledge Base Embeddings
confidence: 99%
“…Contextual Knowledge Graph Embeddings Whereas our approach extracts the contextual views in a separate step before the actual knowledge graph embedding, there exist works that create contextualized KG embeddings based on the full KG. Werner et al. [50] introduced a KG embedding over temporally contextualized KG facts. Their recurrent transformer makes it possible to transform global KGEs into contextual embeddings, given the situation-specific factors of the relation and the subjective history of the entity.…”
Section: Related Work
confidence: 99%
“…Similar approaches representing the context in a driving scenario are shown in [24,26,38,74]. Ontologies have also been used for context-dependent recommendation tasks [108,40].…”
Section: Knowledge Representation Learning
confidence: 99%