2021
DOI: 10.1016/j.knosys.2021.107369

Learning hyperbolic attention-based embeddings for link prediction in knowledge graphs

Cited by 21 publications (14 citation statements)
References 23 publications
“…For example, TransE encodes entities and relations to linear space and TransR [22] expands the single linear space to a set of relation-specific linear spaces. The embedding space can also be complex space [39], multi-dimensional Gaussian distribution space [43], hyperbolic space [49] and manifold space [16]. RotatE [37] maps the representation vectors to complex space and utilizes rotation operation of complex space to describe relation information between entities.…”
Section: Knowledge Graph Embedding (citation type: mentioning)
confidence: 99%
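The translational and rotational scoring functions this statement refers to can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration, not code from any of the cited papers: TransE scores a triple by the distance between h + r and t, while RotatE treats each relation as an element-wise unit-modulus rotation in the complex plane. Variable names and the toy embedding dimension are assumptions made only for the example.

    import numpy as np

    def transe_score(h, r, t):
        # TransE: a triple (h, r, t) is plausible when h + r is close to t,
        # so the score is the negative Euclidean distance.
        return -np.linalg.norm(h + r - t)

    def rotate_score(h, r_phase, t):
        # RotatE: relations act as element-wise rotations in the complex plane.
        # h and t are complex vectors; r_phase holds rotation angles, so each
        # relation element exp(i * phase) has modulus 1.
        r = np.exp(1j * r_phase)
        return -np.linalg.norm(h * r - t)

    # Toy usage with random 4-dimensional embeddings.
    rng = np.random.default_rng(0)
    h, r, t = rng.normal(size=(3, 4))
    print(transe_score(h, r, t))
    h_c = rng.normal(size=4) + 1j * rng.normal(size=4)
    t_c = rng.normal(size=4) + 1j * rng.normal(size=4)
    print(rotate_score(h_c, rng.uniform(0.0, 2 * np.pi, size=4), t_c))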
“…HGCN [25] further combines hyperbolic representation learning with the message passing network, projecting the node representations from the hyperbolic space to the tangent space for message aggregation. Based on the success of the above studies, [26,27,28,29] find that the hyperbolic space can well fit the hierarchical and logical patterns in KGs, and thus achieve a series of competitive performances through the high-fidelity and concise representation learning. Similarly, user-item graphs also exhibit the characteristics of a hierarchical and power-law distribution, and [30,31] therefore demonstrate the significance of combining hyperbolic geometry with recommendation tasks.…”
Section: Knowledge-enhanced Collaborative Filtering (citation type: mentioning)
confidence: 99%
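The projection from hyperbolic space to the tangent space mentioned for HGCN-style message aggregation is commonly realized with the logarithmic map at the origin of the Poincaré ball, with the exponential map taking the aggregated result back to the ball. The sketch below uses the standard Poincaré-ball formulas with curvature -c; it is an illustrative outline under those assumptions, not code from the cited works.

    import numpy as np

    def logmap0(x, c=1.0, eps=1e-9):
        # Logarithmic map at the origin of the Poincare ball with curvature -c:
        # sends a point x on the ball into the Euclidean tangent space at 0,
        # where ordinary message aggregation (weighted sums, attention) applies.
        norm = max(np.linalg.norm(x), eps)
        return np.arctanh(np.sqrt(c) * norm) * x / (np.sqrt(c) * norm)

    def expmap0(v, c=1.0, eps=1e-9):
        # Exponential map at the origin: takes an aggregated tangent vector
        # back onto the Poincare ball.
        norm = max(np.linalg.norm(v), eps)
        return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

    # Aggregate two neighbour embeddings in the tangent space, then map back.
    x1 = np.array([0.10, 0.20])
    x2 = np.array([0.05, -0.10])
    aggregated = 0.5 * (logmap0(x1) + logmap0(x2))
    print(expmap0(aggregated))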
“…COMPGCN (Vashishth et al, 2019) introduces the entity-relation composition to leverage the combination of relations and nodes during the message transmission. Zeb et al (2021) propose an encoder-decoder framework based on the relation hyperbolic graph neural network (HyperGEL), which proved effective. Similarly, Zhang et al (2022) utilize the graph attention network enhanced with association rules to improve the performance of knowledge inference.…”
Section: Related Work 2.1 Structure-based Knowledge Graph Completion (citation type: mentioning)
confidence: 99%
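The entity-relation composition that CompGCN introduces amounts to combining a neighbour's embedding with the embedding of the connecting relation before message passing; subtraction is one of the composition choices described for that model. The simplified layer below is a sketch of that idea; the function names, the mean aggregation, and the single shared weight matrix are illustrative assumptions rather than the published architecture.

    import numpy as np

    def compose_sub(e_node, e_rel):
        # Subtraction composition: combine a neighbour embedding with the
        # embedding of the relation connecting it (one of several options).
        return e_node - e_rel

    def compgcn_like_layer(node_emb, rel_emb, edges, W):
        # One simplified message-passing step: for every edge (u, r, v),
        # compose neighbour u with relation r, apply a linear transform W,
        # and average the messages arriving at v.
        out = np.zeros_like(node_emb)
        deg = np.full(node_emb.shape[0], 1e-9)
        for u, r, v in edges:
            out[v] += compose_sub(node_emb[u], rel_emb[r]) @ W
            deg[v] += 1.0
        return out / deg[:, None]

    # Toy graph: 3 entities, 2 relations, 2 edges into entity 1.
    rng = np.random.default_rng(0)
    nodes = rng.normal(size=(3, 4))
    rels = rng.normal(size=(2, 4))
    W = rng.normal(size=(4, 4))
    print(compgcn_like_layer(nodes, rels, [(0, 0, 1), (2, 1, 1)], W))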
“…Compared with other related studies of knowledge graph completion, the proposed model has the following advantages. First, compared with methods based on network structure such as PairRE (Chao et al, 2021), DMACM (Huang et al, 2021), RNNLogic (Qu et al, 2021) and HyperGEL (Zeb et al, 2021), the EDA-KGC model has significantly better performance due to the external information. Second, compared with methods that leverage entity description, such as DKRL (Xie et al, 2016b) and EDGE (Zhou et al, 2019), the proposed model uses the comprehensive integration of entity description and network structure to capture the potential semantic information.…”
Section: FB15k-237 MRR (citation type: mentioning)
confidence: 99%