2020
DOI: 10.1007/978-3-030-47436-2_41

Attention-Based Aggregation Graph Networks for Knowledge Graph Information Transfer

Abstract: Knowledge graph completion (KGC) aims to predict missing information in a knowledge graph. Many existing embedding-based KGC models solve the out-of-knowledge-graph (OOKG) entity problem (also known as the zero-shot entity problem) by utilizing textual resources such as descriptions and types. However, few works utilize the extra structural information to generate embeddings. In this paper, we propose a new zero-shot scenario: how to acquire the embedding vector of a relation that is not observed at training…
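
To make the aggregation idea in the abstract concrete, here is a minimal sketch of attention-weighted aggregation for a relation unseen at training time, assuming a TransE-style translation h + r ≈ t so that each observed entity pair (h, t) yields a candidate relation vector t − h. The function names, the dot-product attention, and the choice of query vector are illustrative assumptions, not the paper's actual architecture.

import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def aggregate_unseen_relation(entity_pairs, query):
    # entity_pairs: list of (head_vec, tail_vec) observed with the new relation.
    # query: context vector used to score each pair's relevance.
    # Transition: under h + r ~ t, each pair gives a candidate r = t - h.
    candidates = np.stack([t - h for h, t in entity_pairs])
    # Aggregation: softmax attention over dot-product relevance scores.
    weights = softmax(candidates @ query)
    return weights @ candidates

# Toy usage with random 8-dimensional embeddings.
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=8), rng.normal(size=8)) for _ in range(3)]
query = np.mean([t - h for h, t in pairs], axis=0)
print(aggregate_unseen_relation(pairs, query).shape)  # (8,)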

Cited by 14 publications (16 citation statements); References 14 publications
“…To improve efficiency, some inductive methods adopt GNN to aggregate IKG neighbors to produce embeddings for OOKG entities (Hamaguchi et al., 2017; Wang et al., 2019; Bi et al., 2020; Zhao et al., 2020). These methods are effective but need relatively complex calculations.…”
Section: Related Work
confidence: 99%
“…In recent years, some inductive methods have been proposed for OOKG entities without retraining. Hamaguchi et al. (2017); Wang et al. (2019); Bi et al. (2020); Zhao et al. (2020) adopt Graph Neural Networks (GNN) to aggregate the IKG neighbors to represent the OOKG entities. These methods are effective but require relatively complex calculations, which could be simplified for higher efficiency.…”
Section: Introduction
confidence: 99%
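
For contrast, the neighbor aggregation for OOKG entities that these works describe can be reduced to a very simple form; the mean-pooling sketch below (a simplification of the GNN aggregation, again assuming a TransE-style score and illustrative names, not any cited paper's exact formulation) shows the basic transition-then-aggregate pattern without attention.

import numpy as np

def embed_ookg_entity(neighbor_triples):
    # neighbor_triples: list of (relation_vec, entity_vec, ookg_is_head).
    # Transition: under h + r ~ t, the candidate for the missing entity is
    # t - r when it is the head of the triple, and h + r when it is the tail.
    candidates = [ent - rel if ookg_is_head else ent + rel
                  for rel, ent, ookg_is_head in neighbor_triples]
    # Aggregation: plain mean pooling instead of attention.
    return np.mean(np.stack(candidates), axis=0)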
“…For instance, [4] chooses a semi-supervised strategy to predict the unseen entities based on their relations with the in-sample entities and absorbs them into the KG. [273] places these operations as "transition" and "aggregation" modules.…”
Section: Graph/ontology
confidence: 99%
“…Graph modeling is a comprehensive task involving a variety of learning objects. When some objects are scarce or completely unseen, many LSL graph modeling methods attempt to exploit other supporting information to make up for the lack of partial training materials [4,226,237,264,273,274]. When the novel relations or entities have some correlation with the original elements in the KG, the metric-based method [48,139,236,260] can learn a reliable embedding space.…”
Section: Graph/ontology
confidence: 99%
“…There are also some recent works focusing on the OOKB problem in the KBC task. Some researchers focus on different types of tasks like link prediction [36], [37] in few-shot learning and entity detection [38], while we focus on triplet classification. Some researchers used the jointly embedding method [39] and multimodal data enhanced representation [40] to achieve OOKB entity embedding, while in our work these external data are not considered. Reference [41] used attention-based aggregation to solve the new OOKB relation problem. Their idea and method are exciting and we want to extend our work to the OOKB relation problem in the future. Many studies assumed a specific scenario while our work considers only the standard scenario of the…”
Section: Related Work
confidence: 99%