Proceedings of the Web Conference 2020 2020
DOI: 10.1145/3366423.3380089
Relation Adversarial Network for Low Resource Knowledge Graph Completion

Abstract: Knowledge Graph Completion (KGC) has been proposed to improve Knowledge Graphs by filling in missing connections via link prediction or relation extraction. One of the main difficulties for KGC is the low-resource problem. Previous approaches assume sufficient training triples to learn versatile vectors for entities and relations, or a satisfactory number of labeled sentences to train a competent relation extraction model. However, low-resource relations are very common in KGs, and those newly added relations of…

Cited by 72 publications (40 citation statements)
References 48 publications (64 reference statements)
“…), previous studies utilize KG embeddings or textual encoding to represent triples and leverage a pre-defined scoring function over those vectors. With pretrained encoders, approaches such as KG-BERT [Yao et al., 2019] are generalizable and robust to incompleteness; however, they must costly score all possible triples at inference time. In this paper, we simply leverage masked entity modeling for link prediction, which makes the model predict the correct entity e_k as in the Masked Language Model (MLM) task.…”
Section: Contextualized KG Representation
confidence: 99%
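The contrast this citation draws, between scoring every candidate triple separately and predicting the masked entity in a single pass, can be sketched as below. This is a hedged toy illustration, not the cited paper's implementation: the entities, the DistMult-style scoring function, and the toy embeddings are all invented for the demo.

```python
# Toy sketch (assumptions, not the paper's code): contrast triple scoring,
# where every candidate tail entity needs its own scoring call, with masked
# entity modeling, where one encoded "mask" is compared to all entities.
ENTITIES = ["Paris", "France", "Berlin", "Germany"]

def embed(name, dim=4):
    # Deterministic toy embedding derived from the string (illustration only).
    return [((sum(map(ord, name)) * (i + 1)) % 97) / 97.0 for i in range(dim)]

def score_triple(h, r, t):
    # DistMult-style score sum_i h_i * r_i * t_i (one of many possible choices).
    return sum(a * b * c for a, b, c in zip(embed(h), embed(r), embed(t)))

def rank_by_triple_scoring(h, r):
    # KG-BERT-style inference cost: one scoring call per candidate entity.
    return sorted(ENTITIES, key=lambda t: -score_triple(h, r, t))

def rank_by_masked_modeling(h, r):
    # Masked entity modeling: encode (h, r, [MASK]) once, then compare that
    # single context vector against all entity embeddings, as in an MLM head.
    ctx = [a + b for a, b in zip(embed(h), embed(r))]  # toy "mask" encoding
    logits = {t: sum(c * e for c, e in zip(ctx, embed(t))) for t in ENTITIES}
    return sorted(ENTITIES, key=lambda t: -logits[t])

print(rank_by_triple_scoring("Paris", "capital_of"))
print(rank_by_masked_modeling("Paris", "capital_of"))
```

The efficiency point is that `rank_by_triple_scoring` grows as one model call per entity, while the masked variant amortizes one encoding across all candidates.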
“…Knowledge Graphs (KGs) organize facts in a structured way as triples in the form of <subject, predicate, object>, abridged as (s, p, o), where s and o denote entities and p builds relations between entities. Most KGs are far from complete due to emerging entities and relations in real-world applications; hence KG completion (the problem of extending a KG with missing triples) has appealed to researchers [Lin et al., 2015; Zhang et al., 2019a; Zhang et al., 2020; Zhang et al., 2021a; Qi et al., 2021; Zhang et al., 2021b].…”
Section: Introduction
confidence: 99%
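The (s, p, o) framing above can be made concrete with a minimal sketch: a KG stored as a set of triples, with completion framed as proposing candidate triples absent from the graph. The entity and relation names here are invented for illustration.

```python
# Hedged sketch: a KG as a set of (s, p, o) triples. KG completion is framed
# as enumerating candidate triples not yet in the graph; a trained completion
# model would then score these candidates, which this toy does not do.
kg = {
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
    ("France", "located_in", "Europe"),
}

def candidate_missing_triples(kg):
    # Combine observed entities and relations into (s, p, o) candidates
    # that are absent from the KG.
    entities = {s for s, _, _ in kg} | {o for _, _, o in kg}
    relations = {p for _, p, _ in kg}
    return [(s, p, o) for s in entities for p in relations
            for o in entities if s != o and (s, p, o) not in kg]

cands = candidate_missing_triples(kg)
print(("Germany", "located_in", "Europe") in cands)  # True
```

Real KGs make exhaustive enumeration infeasible, which is exactly why learned scoring functions over entity and relation vectors are used instead.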
“…• How to help models? We could use our approach of NAL to increase the weight of the knowledge learned in the pre-training task or leverage external knowledge (Zhang et al., 2019, 2020b; Yu et al., 2020; Zhang et al., 2020a).…”
Section: Case Study
confidence: 99%
“…Relational Triple Extraction is an essential task in Information Extraction (IE) for Natural Language Processing (NLP) and Knowledge Graphs (KGs) (Zhang et al., 2018b; Yu et al., 2017; Nan et al., 2020; Zhang et al., 2020a; Ye et al., 2020; Zhang et al., 2020b), which aims to detect pairs of entities along with their relations from unstructured text. For instance, there is a sentence "Paris is known as the romantic capital of France.…”
Section: Relational Triple Extraction
confidence: 99%
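The example sentence in this citation can be used to show what the task produces. The sketch below is a deliberately naive pattern-based illustration; actual relational triple extraction systems use learned models, and the regex and relation name are invented for the demo.

```python
# Hedged toy: pattern-based relational triple extraction for the one example
# sentence. Real systems learn this mapping; this regex exists only to show
# the task's input (a sentence) and output (subject, relation, object) triples.
import re

PATTERNS = [
    # "X is known as the ... capital of Y." -> (X, capital_of, Y)
    (re.compile(r"^(\w+) is known as the .*capital of (\w+)\."), "capital_of"),
]

def extract_triples(sentence):
    triples = []
    for pattern, relation in PATTERNS:
        m = pattern.match(sentence)
        if m:
            triples.append((m.group(1), relation, m.group(2)))
    return triples

print(extract_triples("Paris is known as the romantic capital of France."))
# -> [('Paris', 'capital_of', 'France')]
```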