2018
DOI: 10.1587/transinf.2017edp7398

Modeling Complex Relationship Paths for Knowledge Graph Completion

Abstract: Determining the validity of knowledge triples and filling in the missing entities or relationships in the knowledge graph are crucial tasks for large-scale knowledge graph completion. So far, the main solutions use machine learning methods to learn low-dimensional distributed representations of entities and relationships to complete the knowledge graph. Among them, translation models obtain excellent performance. However, the proposed translation models do not adequately consider the indirect relations…
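The translation models the abstract refers to score a triple (head, relation, tail) by treating the relation as a vector translation from head to tail. A minimal TransE-style sketch, with toy random embeddings and illustrative entity names (the paper's actual model and training procedure differ):

```python
import numpy as np

# Toy embeddings; a real model learns these by minimizing the score
# of true triples against corrupted ones.
rng = np.random.default_rng(0)
dim = 8
entities = {name: rng.normal(size=dim) for name in ["Paris", "France", "Berlin"]}
relations = {"capital_of": rng.normal(size=dim)}

def score(h, r, t):
    """Translation-model score: distance between (h + r) and t. Lower is better."""
    return np.linalg.norm(entities[h] + relations[r] - entities[t])

# After training, the true tail should rank lower (closer) than corrupted tails.
s_true = score("Paris", "capital_of", "France")
s_false = score("Paris", "capital_of", "Berlin")
```

With untrained random vectors the ranking is arbitrary; the point is only the scoring form h + r ≈ t that the family of translation models shares.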

Cited by 4 publications (4 citation statements)
References 17 publications (18 reference statements)
“…ii) Knowledge graph completion based on sequence learning: Recurrent Neural Network (RNN) and its improved model, Long Short-Term Memory (LSTM), are the main models used in knowledge graph completion based on sequence learning [106]. [FIGURE 13: Two-layer DSKG model.] In the traditional neural network model, data is transmitted from the input layer to the hidden layer and then to the output layer.…”
Section: ) Other Knowledge Graph Completion Methods Based On Neural mentioning
confidence: 99%
“…RNN has performed well in many natural language processing (NLP) tasks, such as language modeling and machine translation [65], [66]. The literature [106] builds on the RNN model to put forward the Deep Sequential Model for Knowledge Graph Completion (DSKG), and applies a specific sampling method for model training, addressing the problems RNN meets when applied to knowledge graph data: i) a triple is not natural language, and the short sequences converted from triples may not provide enough contextual information for prediction.…”
Section: ) Other Knowledge Graph Completion Methods Based On Neural mentioning
confidence: 99%
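The sequence-learning idea quoted above treats a triple as a very short input sequence: feed the head and relation embeddings step by step into a recurrent cell, then score every candidate tail entity from the final hidden state. A minimal sketch with an untrained, illustrative plain-RNN cell (the cited DSKG model is a deeper, trained architecture):

```python
import numpy as np

# Illustrative sizes; a real model would learn these weights.
rng = np.random.default_rng(1)
dim, n_entities = 8, 5
W_in = rng.normal(size=(dim, dim))          # input-to-hidden weights
W_h = rng.normal(size=(dim, dim))           # hidden-to-hidden weights
W_out = rng.normal(size=(n_entities, dim))  # hidden-to-entity-score weights

def rnn_predict_tail(head_vec, rel_vec):
    """Run the (head, relation) pair as a 2-step sequence; return tail scores."""
    h = np.zeros(dim)
    for x in (head_vec, rel_vec):            # the "sequence" is just (h, r)
        h = np.tanh(W_in @ x + W_h @ h)
    return W_out @ h                          # one score per candidate tail entity

scores = rnn_predict_tail(rng.normal(size=dim), rng.normal(size=dim))
```

This also illustrates the quoted caveat: a two-element sequence carries little context, which is part of what DSKG's deeper sequential design and sampling strategy aim to compensate for.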