2021
DOI: 10.1007/978-3-030-88361-4_24
Improving Knowledge Graph Embeddings with Ontological Reasoning

Cited by 20 publications (27 citation statements)
References 26 publications
“…The possibility of using additional semantic information has been extensively studied in recent works [Jain et al., 2021a, Krompaß et al., 2015, Niu et al., 2020, Cui et al., 2021, Xie et al., 2016, Lv et al., 2018, Wang et al., 2021b]. In general, the semantic information stems directly from an ontology, originally defined by Gruber as an "explicit specification of a conceptualization" [Gruber, 1993].…”
Section: Combining Embeddings and Semantics
confidence: 99%
“…Training a Knowledge Graph Embedding Model (KGEM) firstly requires corrupting existing triples by replacing either their head h or their tail t with another entity to generate negative counterparts. This procedure is called negative sampling [Bordes et al., 2013, Krompaß et al., 2015, Jain et al., 2021a].…”
Section: Introduction
confidence: 99%
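The excerpt above describes the standard negative-sampling step used to train a knowledge graph embedding model. Below is a minimal, illustrative Python sketch of uniform corruption of a (head, relation, tail) triple; the function name, toy entity list, and 50/50 head-vs-tail choice are assumptions for illustration, not code from the cited paper.

```python
import random

def corrupt_triple(triple, entities, rng=random):
    """Create one negative sample by replacing either the head or the
    tail of a (head, relation, tail) triple with a different entity."""
    h, r, t = triple
    if rng.random() < 0.5:
        # corrupt the head
        new_h = rng.choice([e for e in entities if e != h])
        return (new_h, r, t)
    # corrupt the tail
    new_t = rng.choice([e for e in entities if e != t])
    return (h, r, new_t)

# Toy usage: one positive triple and a small entity set
entities = ["Berlin", "Germany", "Paris", "France"]
positive = ("Berlin", "capitalOf", "Germany")
print(corrupt_triple(positive, entities))  # e.g. ('Paris', 'capitalOf', 'Germany')
```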
“…Therefore, more sophisticated methods have been proposed to generate high-quality negative samples and consequently give more hints to the model training [30]. In particular, ontological constraints and domain knowledge can be leveraged to create meaningful negative samples [10,13,26]. Intuitively, generating more realistic and robust negative samples helps the embedding model learn a better vector representation of the graph components.…”
Section: Negative Sampling For Link Prediction In Knowledge Graphs
confidence: 99%
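To make the idea of ontology-guided negatives concrete, here is a small hedged sketch of type-constrained tail corruption, one common way such constraints are applied; the type map, the `range_of` dictionary, and the function name are illustrative assumptions rather than the exact procedure of the works cited above.

```python
import random

def type_constrained_negative(triple, entity_types, range_of, rng=random):
    """Corrupt the tail of a triple using only entities whose type matches
    the relation's declared range, producing plausible but false triples
    that are harder for the embedding model to separate from true ones."""
    h, r, t = triple
    allowed = range_of[r]
    candidates = [e for e, etype in entity_types.items()
                  if etype == allowed and e != t]
    if not candidates:
        return None  # no type-compatible entity; caller may fall back to uniform sampling
    return (h, r, rng.choice(candidates))

# Toy ontology: the range of capitalOf is Country
entity_types = {"Berlin": "City", "Paris": "City",
                "Germany": "Country", "France": "Country"}
range_of = {"capitalOf": "Country"}
print(type_constrained_negative(("Berlin", "capitalOf", "Germany"),
                                entity_types, range_of))
# -> ('Berlin', 'capitalOf', 'France'), a hard negative consistent with the ontology
```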
“…In fact, [4] states that most existing KGEs are not capable of encoding ontological information. [5] use ontological information but only for improving the negative samples that KGEs require.…”
Section: Introduction
confidence: 99%