Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
DOI: 10.18653/v1/2021.findings-acl.115
Incorporating Global Information in Local Attention for Knowledge Representation Learning

Abstract: Graph Attention Networks (GATs) have proven to be a promising model that takes advantage of a localized attention mechanism to perform knowledge representation learning (KRL) on graph-structured data, e.g., Knowledge Graphs (KGs). While such approaches model entities' local pairwise importance, they lack the capability to model global importance relative to other entities of KGs. This causes such models to miss critical information in tasks where global information is also a significant component for the task, such as …

Cited by 11 publications (6 citation statements)
References 59 publications (56 reference statements)
“…In addition, we will try to use better methods to utilize the knowledge graph structure, such as other knowledge graph embedding methods. As introduced in Section 2.3, some recent knowledge graph embedding methods such as HAKE [44], PairRE [45], DualE [46], and EIGAT [47] can better encode entities and relations in knowledge graphs, and theoretically they should further improve the performance of entity linking.…”
Section: Discussion
confidence: 99%
“…DualE universally models relations as the combination of a series of translation and rotation operations. EIGAT [47] allows correct incorporation of global information into the graph attention network (GAT) family of models by using scaled entity importance, which is computed by an attention-based global random walk algorithm. In order to focus on the importance of the knowledge graph structure for the entity linking task, the knowledge graph embedding method used in this article is the most basic TransE model.…”
Section: Knowledge Graph Embedding
confidence: 99%
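The statement above describes TransE as the most basic knowledge graph embedding model. Not part of the indexed text, but as a minimal NumPy sketch of the TransE idea it refers to: a relation embedding r is treated as a translation from the head entity h to the tail entity t, so a triple is scored by the (negated) distance ||h + r - t||. Embedding values here are illustrative, not trained.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE plausibility score for a triple (h, r, t):
    the closer h + r lies to t, the higher (less negative) the score."""
    return -np.linalg.norm(h + r - t, ord=norm)

# Toy 4-dimensional embeddings (hypothetical values for illustration only).
h = np.array([0.1, 0.2, 0.0, 0.3])
r = np.array([0.4, -0.1, 0.2, 0.0])
t_good = h + r                        # tail that the relation translates onto exactly
t_bad = np.array([1.0, 1.0, 1.0, 1.0])

# A well-matched tail scores strictly higher than a mismatched one.
assert transe_score(h, r, t_bad) < transe_score(h, r, t_good)
```

In training, such scores are typically optimized with a margin-based ranking loss that pushes corrupted triples below true ones; that part is omitted here.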
“…Zhang, Zhuang et al. [4] transform it into relation-level attention and entity-level attention. Zhao, Zhou et al. [17] add an additional global attention mechanism. Fang, Wang et al. [18] consider the impact of neighborhoods on the representation learning of relations.…”
Section: Related Work: Link Prediction
confidence: 99%
“…Due to the graph structure of KGs, it is natural to introduce graph convolutional networks (GCNs) into link prediction. In addition, graph attention networks (GATs) [3]-[5], which incorporate attention mechanisms into GCNs, achieve a great improvement.…”
Section: Introduction
confidence: 99%
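The localized attention that the statement above attributes to GATs can be sketched in a few lines of NumPy. This is not the cited paper's model, only a minimal single-head GAT layer: each node attends over its graph neighbors with softmax-normalized coefficients, which is exactly the local pairwise importance that the indexed paper argues misses global entity importance. All shapes and values are illustrative.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_layer(H, adj, W, a):
    """One single-head GAT layer: for node i with neighbor j,
    alpha_ij = softmax_j(LeakyReLU(a^T [W h_i || W h_j])),
    and the output is the attention-weighted sum of projected neighbors."""
    Wh = H @ W                                  # (N, F') projected node features
    out = np.zeros_like(Wh)
    for i in range(H.shape[0]):
        nbrs = np.nonzero(adj[i])[0]            # local neighborhood of node i
        e = np.array([np.concatenate([Wh[i], Wh[j]]) @ a for j in nbrs])
        e = np.where(e > 0, e, 0.2 * e)         # LeakyReLU with slope 0.2
        alpha = softmax(e)                      # local pairwise importance only
        out[i] = (alpha[:, None] * Wh[nbrs]).sum(axis=0)
    return out

rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))                     # 4 nodes, 3 input features
adj = np.array([[1, 1, 0, 0],                   # toy chain-like adjacency
                [1, 1, 1, 0],                   # (self-loops included)
                [0, 1, 1, 1],
                [0, 0, 1, 1]])
W = rng.normal(size=(3, 2))                     # projection to 2 output features
a = rng.normal(size=(4,))                       # attention vector over [Wh_i || Wh_j]
out = gat_layer(H, adj, W, a)                   # shape (4, 2)
```

Because the softmax runs only over each node's neighborhood, no signal about a node's importance relative to the rest of the graph enters the coefficients; adding that global term is the gap the indexed paper targets.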
“…Previous work generally uses one-hot encoding to represent entity knowledge, but using graph neural networks to encode structural knowledge and attribute entity representations for fine-grained modeling of entities, vocabulary, etc., allows the introduction of multi-angle feature knowledge [19]. Knowledge representation based on graph networks can flexibly use local, spatial, and other structural information, and can effectively address the polysemy of words [20]. Therefore, using graph networks to implement knowledge representation has become a hot research topic in recent years; for example, Ebisu, Balazevic, Zhang, and other related scholars have used graphs to accomplish different tasks [21] [22] [23].…”
Section: Related Work: Graph Neural Network and Knowledge Representation
confidence: 99%