Published: 2022
DOI: 10.1016/j.knosys.2022.109889
A knowledge graph completion model based on contrastive learning and relation enhancement method

Cited by 21 publications (4 citation statements)
References 34 publications
“…Subsequently, representation learning models based on GNNs and GATs have been proposed. For instance, R-GCN [31] and ComplexGCN [32] handle node relations based on convolution, while KBGAT [33] and MRGAT [34] respectively consider attention mechanisms and heterogeneous multiple relation connections when aggregating neighboring information. While deep learning-based models demonstrate excellent predictive capabilities between unknown entities and relations, they come with high computational costs and training difficulties.…”
Section: Deep Learning-based Models (mentioning, confidence: 99%)
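The statement above contrasts convolution-based aggregation (R-GCN) with attention-based aggregation (KBGAT) over a node's relational neighborhood. As a minimal sketch of the attention idea only, not any of the cited papers' actual implementations: each (neighbor, relation) pair is projected into a message, scored, softmax-normalized, and summed. The parameters `W` and `a` are hypothetical learned weights introduced here for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attn_aggregate(node, neighbors, relations, W, a):
    # Attention-weighted neighborhood aggregation in the spirit of
    # KBGAT-style models: build one message per (neighbor, relation)
    # pair, score it, and sum messages with softmax attention weights.
    msgs = np.stack([W @ np.concatenate([node, n, r])
                     for n, r in zip(neighbors, relations)])
    weights = softmax(msgs @ a)          # one attention weight per neighbor
    return (weights[:, None] * msgs).sum(axis=0)

# Tiny example: 2-d embeddings, two neighbors.
rng = np.random.default_rng(0)
node = np.array([1.0, 0.0])
neighbors = [np.array([0.0, 1.0]), np.array([1.0, 1.0])]
relations = [np.array([0.5, 0.5]), np.array([0.0, 1.0])]
W = rng.standard_normal((2, 6))          # projects concat(node, nbr, rel)
a = rng.standard_normal(2)               # attention scoring vector
print(attn_aggregate(node, neighbors, relations, W, a))
```

Real models add multi-head attention, nonlinearities, and learned relation-specific transforms; this only shows the weighted-aggregation step the quote refers to.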
“…These models include ATTH [38], HyperGEL [39], MuRP [36], and Hybonet [42]. Additionally, we will compare them with state-of-the-art and representative Euclidean and complex space embedding models that have proposed knowledge graph reasoning methods, including TransE [21], DistMult [26], MuRE [36], TuckER [28], ConvE [29], ConvKB [30], and KBGAT [33] for Euclidean space, and ComplEx [27], RotatE [23], and ComplexGCN [32] for complex space. Furthermore, we will consider graph neural network based models and recent models with outstanding predictive performance, such as R-GCN [31], MRGAT [34], and GTKGC [43].…”
Section: Baselines (mentioning, confidence: 99%)
“…Moreover, we did not incorporate multi-features such as phonetics information (another type of sequence information) [41] or graph technology to extract latent entity relationships containing auxiliary knowledge [45]. Moreover, a foreseeable challenge in integrating multifeatures into Word-unit BLS is the substantial increase in training time.…”
Section: B. Limitations (mentioning, confidence: 99%)
“…For translation models and bilinear models, the models are relatively simple, and it is difficult to fully explore the relationship between triple entities. Neural network models and rotation models often need to spend more memory space to express the embeddings of entities to obtain more semantic information [12][13][14]. In order to reduce the demand for memory space, in recent years, hyperbolic models have gradually attracted attention in hyperbolic space [15,16].…”
Section: Introduction (mentioning, confidence: 99%)
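The distinction this last statement draws between translation models and bilinear models can be made concrete with the standard scoring functions of TransE and DistMult (the functions below follow the published definitions; the embeddings are toy values chosen for illustration).

```python
import numpy as np

def transe_score(h, r, t):
    # TransE (translation model): a triple (h, r, t) is plausible when
    # h + r is close to t; score is the negative Euclidean distance,
    # so higher means more plausible.
    return -np.linalg.norm(h + r - t)

def distmult_score(h, r, t):
    # DistMult (bilinear model with a diagonal relation matrix):
    # score is sum_i h_i * r_i * t_i. The diagonal form is cheap but
    # symmetric in h and t, so it cannot distinguish the direction
    # of asymmetric relations -- one source of the limited expressiveness
    # the quoted passage mentions.
    return np.sum(h * r * t)

h = np.array([1.0, 0.0])
r = np.array([0.0, 1.0])
t = np.array([1.0, 1.0])
print(transe_score(h, r, t))    # h + r equals t exactly, so distance is 0
print(distmult_score(h, r, t))
```

Neural, rotational, and hyperbolic models trade this simplicity for richer interactions, which is exactly the memory/expressiveness trade-off the quoted introduction describes.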