Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing (EMNLP 2018)
DOI: 10.18653/v1/d18-1223

One-Shot Relational Learning for Knowledge Graphs

Abstract: Knowledge graphs (KGs) are the key components of various natural language processing applications. To further expand KGs' coverage, previous studies on knowledge graph completion usually require a large number of training instances for each relation. However, we observe that long-tail relations are actually more common in KGs and those newly added relations often do not have many known triples for training. In this work, we aim at predicting new facts under a challenging setting where only one training instance…

Cited by 183 publications (171 citation statements) · References 38 publications
“…To do few-shot link prediction, Xiong et al. (2018) made the first attempt and proposed GMatching, which learns a matching metric that considers both learned embeddings and one-hop graph structures. We instead approach few-shot link prediction from another perspective, based on the intuition that the most important information to be transferred from a few existing instances to incomplete triples is the common, shared knowledge within one task. We call such information relation-specific meta information and propose a new framework, Meta Relational Learning (MetaR), for few-shot link prediction.…”
Section: Introduction
confidence: 99%
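As a rough illustration of the matching-metric idea behind GMatching, the sketch below scores a query entity pair against a one-shot support pair by (i) enriching each entity embedding with a mean-pooled encoding of its one-hop (relation, neighbor) pairs and (ii) comparing the resulting pair representations. The tensor shapes, the mean-pooling, and the cosine similarity are illustrative assumptions, not the paper's exact architecture (which also includes an LSTM-based matching processor).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeighborEncoder(nn.Module):
    """Enrich an entity embedding with a mean-pooled encoding of its
    one-hop (relation, neighbor) pairs. Shapes are illustrative."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, ent_emb, nbr_rel_emb, nbr_ent_emb):
        # ent_emb: (dim,); nbr_rel_emb, nbr_ent_emb: (num_neighbors, dim)
        nbr = torch.cat([nbr_rel_emb, nbr_ent_emb], dim=-1)  # (n, 2*dim)
        pooled = torch.tanh(self.proj(nbr)).mean(dim=0)      # (dim,)
        return F.relu(ent_emb + pooled)

def pair_representation(encoder, head, tail, head_nbrs, tail_nbrs):
    """Concatenate neighbor-enriched head and tail embeddings."""
    h = encoder(head, *head_nbrs)
    t = encoder(tail, *tail_nbrs)
    return torch.cat([h, t], dim=-1)

def match_score(support_pair, query_pair):
    """Cosine similarity as a stand-in for the learned matching metric."""
    return F.cosine_similarity(support_pair, query_pair, dim=-1)
```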
“…Compared with GMatching (Xiong et al., 2018), which relies on a background knowledge graph, our MetaR is independent of one and is therefore more robust, since background knowledge graphs may not be available for few-shot link prediction in real scenarios.…”
Section: Introduction
confidence: 99%
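A minimal sketch of the relation-specific meta information idea described above: the relation meta is produced from the support pairs alone, with no background graph required, and then used to score query triples. The two-layer relation-meta learner and the TransE-style distance are simplifying assumptions based on the MetaR paper's description, not its exact architecture.

```python
import torch
import torch.nn as nn

class RelationMetaLearner(nn.Module):
    """Map support (head, tail) embedding pairs to relation meta:
    the shared knowledge transferred from support to query triples."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.LeakyReLU(), nn.Linear(dim, dim)
        )

    def forward(self, support_heads, support_tails):
        # support_*: (num_support, dim); average over the support set
        pair = torch.cat([support_heads, support_tails], dim=-1)
        return self.net(pair).mean(dim=0)  # relation meta: (dim,)

def score(head, relation_meta, tail):
    """TransE-style score: smaller distance = more plausible triple."""
    return -torch.norm(head + relation_meta - tail, p=2, dim=-1)
```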
“…KG-BERT [46] takes the entity and relation descriptions of a triple as input and computes the triple's score with BERT. For the low-resource setting, [44] proposed a one-shot relational learning framework that learns a matching metric by considering both the learned embeddings and one-hop graph structures. [6] proposed a Meta Relational Learning (MetaR) framework for few-shot link prediction in KGs.…”
Section: Related Work
confidence: 99%
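A hedged sketch of the KG-BERT scoring idea: the textual descriptions of head, relation, and tail are packed into one BERT input sequence, and a classifier head over [CLS] scores the triple's plausibility. The model name, the single-segment packing, and the binary head are simplifying assumptions; KG-BERT itself separates the three parts with segment embeddings, and the head would be fine-tuned on labeled triples before use.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Assumption: a plain bert-base model with a 2-way head (plausible / implausible).
# The classification head is randomly initialized here and would be fine-tuned
# on positive and corrupted triples before the scores are meaningful.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

def triple_score(head_desc: str, rel_desc: str, tail_desc: str) -> float:
    """Score a triple from its textual descriptions (higher = more plausible)."""
    # Simplification: join the three descriptions with [SEP] in one segment.
    text = f"{head_desc} [SEP] {rel_desc} [SEP] {tail_desc}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, 2)
    return logits.softmax(dim=-1)[0, 1].item()  # probability of "plausible"

print(triple_score("Barack Obama, 44th U.S. president",
                   "place of birth",
                   "Honolulu, a city in Hawaii"))
```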
“…In this paper, we test the Graph Convolutional Network (GCN) and Onto2vec. There are other node embedding methods, but GCN has been shown to work well in practice for prediction tasks when labels have low occurrence frequencies [9,19,27].…”
Section: Entity Encoders
confidence: 99%
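For reference, a single GCN propagation layer computes H' = σ(ÂHW), where Â is the symmetrically normalized adjacency matrix with self-loops; this is the standard Kipf and Welling formulation, sketched here in plain PyTorch with illustrative shapes rather than the cited papers' exact configurations.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph-convolution layer: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, adj, feats):
        # adj: (n, n) adjacency with self-loops; feats: (n, in_dim)
        deg = adj.sum(dim=1)                     # node degrees
        d_inv_sqrt = deg.clamp(min=1).pow(-0.5)  # D^{-1/2}, guard zero degree
        a_hat = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        return torch.relu(a_hat @ self.linear(feats))

# Toy usage: 4 nodes on a path graph, 8-dim features, 2 stacked layers.
adj = torch.eye(4) + torch.tensor(
    [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]],
    dtype=torch.float,
)
h = torch.randn(4, 8)
h = GCNLayer(8, 8)(adj, h)
h = GCNLayer(8, 4)(adj, h)
```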
“…For example, a distance metric for GO terms can be inferred from the GO tree or from their definitions, and then used as an intuitive constraint that forces similar terms to have similar prediction probabilities for a given protein sequence [24]. More importantly, for rare-label prediction problems, work in other research domains has shown that using vector representations of labels as one of the features can boost classification accuracy [2,19,27]. This paper focuses on the second data resource, the Gene Ontology itself.…”
Section: Introduction
confidence: 99%
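A minimal sketch of the "label vectors as features" idea for rare labels: instead of one output unit per label, a single shared scorer takes the concatenation of the instance representation and the label's vector representation, so rare labels benefit from parameters learned across all labels. The concatenation-plus-MLP scorer and all dimensions are illustrative assumptions, not the exact model from the cited works.

```python
import torch
import torch.nn as nn

class LabelAwareScorer(nn.Module):
    """Score (instance, label) pairs from concatenated representations.
    Because the scorer is shared across labels, rare labels reuse
    parameters learned from frequent ones."""
    def __init__(self, inst_dim, label_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(inst_dim + label_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, inst_vec, label_vec):
        # inst_vec: (batch, inst_dim); label_vec: (batch, label_dim)
        return torch.sigmoid(self.net(torch.cat([inst_vec, label_vec], dim=-1)))

# Toy usage: a protein representation scored against one GO-term embedding.
scorer = LabelAwareScorer(inst_dim=256, label_dim=64)
prob = scorer(torch.randn(1, 256), torch.randn(1, 64))
```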