2017
DOI: 10.1609/aaai.v31i1.10952
SSP: Semantic Space Projection for Knowledge Graph Embedding with Text Descriptions

Abstract: Knowledge graph embedding represents entities and relations in a knowledge graph as low-dimensional, continuous vectors, and thus makes knowledge graphs compatible with machine learning models. Though there have been a variety of models for knowledge graph embedding, most methods merely concentrate on the fact triples, while supplementary textual descriptions of entities and relations have not been fully employed. To this end, this paper proposes the semantic space projection (SSP) model which jointly learns from…
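To make the projection idea concrete, here is a minimal sketch of an SSP-style score in Python. It assumes the form suggested by the abstract, where the translation residual e = h + r − t is scored against a hyperplane whose normal s is derived from the entities' textual descriptions; the random vectors, the weight lam, and the way s is obtained are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50

# Hypothetical inputs: triple embeddings h, r, t, plus a semantic vector s
# assumed to be composed from the entities' textual descriptions
# (e.g., via a topic model) -- all randomly generated here for illustration.
h, r, t = (rng.normal(size=dim) for _ in range(3))
s = rng.normal(size=dim)
s /= np.linalg.norm(s)  # unit normal of the assumed semantic hyperplane

def ssp_score(h, r, t, s, lam=0.2):
    """SSP-style dissimilarity (assumed form): the translation residual
    e = h + r - t is split against the hyperplane with normal s; the
    in-hyperplane component plus the plain translational term are
    penalized. Lower score = more plausible triple."""
    e = h + r - t
    along_s = (s @ e) * s              # component of e along the normal s
    in_plane = e - along_s             # component of e inside the hyperplane
    return lam * np.linalg.norm(in_plane) ** 2 + np.linalg.norm(e) ** 2

print(ssp_score(h, r, t, s))
```

The intent of this sketch is only to show how a description-derived vector can reshape the usual translational score; the exact weighting and composition of s in the published model may differ.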

Cited by 81 publications (23 citation statements)
References 16 publications
“…Knowledge Graph Representation Learning Representation learning has shown great success in a wide range of fields (Zhang et al. 2017; Li et al. 2021; Yuan et al. 2021) and KG reasoning is no exception. Recent advances in this area have proposed a variety of embedding-based methods that project the entities and relations into low-dimensional continuous vector space by exploiting entity types (Guo et al. 2015; Ouyang et al. 2017), relation paths (Lin et al. 2015a; Toutanova et al. 2016; Li et al. 2018a; Zhang et al. 2018), textual descriptions (Zhong et al. 2015; Xiao et al. 2017), and logical rules (Omran, Wang, and Wang 2018; Hamilton et al. 2018). For instance, TransE (Wang et al. 2014) first encoded the entities and relations into latent vectors by following the translational principle in point-wise Euclidean space.…”
Section: Related Work
confidence: 99%
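Since the excerpt above summarizes TransE's translational principle, a minimal runnable sketch may help. This is the standard TransE scoring function with a margin ranking loss, not code from any of the cited papers; the vocabulary sizes, dimension, and sampled triples are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary sizes and embedding dimension (illustrative values).
n_entities, n_relations, dim = 100, 10, 50

# Entity/relation embeddings, randomly initialized and L2-normalized.
E = rng.normal(size=(n_entities, dim))
R = rng.normal(size=(n_relations, dim))
E /= np.linalg.norm(E, axis=1, keepdims=True)
R /= np.linalg.norm(R, axis=1, keepdims=True)

def score(h, r, t):
    """Translational score ||h + r - t||; lower means more plausible."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def margin_loss(pos, neg, margin=1.0):
    """Margin ranking loss over a (positive, corrupted) triple pair:
    the true triple should score lower than the corrupted one by a margin."""
    return max(0.0, margin + score(*pos) - score(*neg))

# A true triple vs. one with a corrupted tail (random negative sample).
print(margin_loss((0, 3, 7), (0, 3, int(rng.integers(n_entities)))))
```

In training, the embeddings would be updated by gradient descent on this loss over many corrupted samples; the sketch only evaluates the score once.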
“…Even though there are a large number of successful studies in modeling relational facts, most of them can only train an embedding model on an observed triplet dataset. Thereupon, there are increasing studies that focus on learning more generalizable KG embedding models by absorbing additional information, such as entity types [28,29], relation paths [30–32], and textual descriptions [33–35].…”
Section: Elon Musk FounderOf LocatedIn
confidence: 99%
“…Most of the currently available techniques perform the embedding task based solely on triples observed in a KG. Some recent work further tried to use other information, e.g., entity types (Guo et al. 2015) and textual descriptions (Xiao, Huang, and Zhu 2017), to learn more predictive embeddings. See (Wang et al. 2017) for a thorough review of KG embedding techniques.…”
Section: Related Work
confidence: 99%