Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1024
Attention Guided Graph Convolutional Networks for Relation Extraction

Abstract: Dependency trees convey rich structural information that is proven useful for extracting relations among entities in text. However, how to effectively make use of relevant information while ignoring irrelevant information from the dependency trees remains a challenging research question. Existing approaches employing rule based hard-pruning strategies for selecting relevant partial dependency structures may not always yield optimal results. In this work, we propose Attention Guided Graph Convolutional Networks…

Cited by 386 publications (285 citation statements) · References 30 publications
“…exploited the shortest dependency path between two entities and the sub-trees attached to that path (augmented dependency path) for relation extraction. Zhang et al (2018) and Guo et al (2019) used graph convolution networks with pruned dependency tree structures for this task. In this work, we have incorporated the dependency distance of the words in a sentence from the two entities in a multi-factor attention mechanism to improve sentence-level relation extraction.…”
Section: Related Work
confidence: 99%
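The dependency-distance feature this excerpt mentions can be obtained with a breadth-first search over the (undirected) dependency tree. The sketch below is purely illustrative; the function name, edge representation, and toy tree are assumptions, not taken from the cited work:

```python
from collections import deque

def dependency_distances(edges, n_tokens, entity_idx):
    """Hop distance of every token from a given entity token,
    via BFS over the undirected dependency tree (illustrative helper)."""
    adj = [[] for _ in range(n_tokens)]
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)
    dist = [-1] * n_tokens
    dist[entity_idx] = 0
    q = deque([entity_idx])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Toy tree: token 1 heads tokens 2 and 3; token 0 heads token 1.
edges = [(0, 1), (1, 2), (1, 3)]
print(dependency_distances(edges, 4, entity_idx=2))  # [2, 1, 0, 2]
```

Running the same BFS from each of the two entity positions yields the per-token distance pair that a multi-factor attention mechanism could consume as features.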
“…exploited the shortest dependency path between two entities and the sub-trees attached to that path (augmented dependency path) for relation extraction. Zhang et al (2018) and Guo et al (2019) used graph convolution networks with pruned dependency tree structures for this task. In this work, we have incorporated the dependency distance of the words in a sentence from the two entities in a multi-factor attention mechanism to improve sentence-level relation extraction.…”
Section: Related Workmentioning
confidence: 99%
“…We employ the attention-guided graph convolutional neural network (AGCNN) (Guo et al, 2019a) to incorporate the dependency information into word representations, which is composed of M identical blocks. Each block has three types of layers: an attention-guided layer, a densely connected layer, and a linear combination layer.…”
Section: Attention-Guided GCNN Layer
confidence: 99%
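As a rough NumPy sketch of the block structure this excerpt describes (all weight names and shapes below are assumptions, and the real AGCNN in Guo et al. 2019 differs in detail): an attention-guided layer induces a soft adjacency matrix from the token representations, stacked GCN sublayers propagate over that matrix, and a final linear layer combines the collected sublayer outputs:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def agcnn_block(X, Wq, Wk, gcn_weights, Wcomb):
    """One AGCNN-style block (illustrative sketch, not the paper's exact model).

    X: (n_tokens, d) token representations.
    """
    # Attention-guided layer: soft adjacency via scaled dot-product attention.
    A = softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(Wq.shape[1]))
    # Stacked GCN sublayers propagate over the soft adjacency;
    # each sublayer's output is collected.
    outs, H = [], X
    for W in gcn_weights:
        H = np.tanh(A @ H @ W)
        outs.append(H)
    # Linear combination layer merges the concatenated sublayer outputs.
    return np.concatenate(outs, axis=-1) @ Wcomb

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))
out = agcnn_block(X,
                  rng.normal(size=(d, d)), rng.normal(size=(d, d)),
                  [rng.normal(size=(d, d)) for _ in range(2)],
                  rng.normal(size=(2 * d, d)))
print(out.shape)  # (5, 8)
```

The key contrast with hard-pruned GCNs is that the attention-derived adjacency is dense and learned, so no dependency edges are discarded up front.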
“…Graph-based models explore the local and global topological structure of the user-item graph combined with other attributes, and aim to learn efficient low-dimensional representations for each user and item. Graph Convolutional Networks (GCNs) [7] and their variants [4, 19-21] extend deep learning algorithms to graph-structured data by defining convolution operators on graphs, and have proven powerful on various downstream tasks [3, 13, 17, 22], including learning low-dimensional embeddings of users and items in a recommender system [19, 21, 26]. However, such models struggle to capture higher-order connectivity patterns among nodes, as they only aggregate information from direct neighboring nodes (or first-order neighbors), though it could be beneficial to take high-order connectivity into account [8, 15].…”
Section: Introduction
confidence: 99%
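The first-order-only aggregation this excerpt criticizes is easy to see in a vanilla GCN layer: with the normalized adjacency of a path graph 0-1-2, a single layer leaves node 0 with no signal from node 2. A minimal sketch (`gcn_layer` and the toy graph are illustrative assumptions):

```python
import numpy as np

def gcn_layer(A, X, W):
    """One vanilla GCN layer: each node aggregates only from its
    first-order neighbours (plus itself, via added self-loops), so
    reaching k-hop neighbours requires stacking k layers."""
    A_hat = A + np.eye(A.shape[0])                     # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)  # ReLU

# Path graph 0-1-2; one-hot node features make information flow visible.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H1 = gcn_layer(A, np.eye(3), np.eye(3))
print(H1[0, 2])  # 0.0 -> node 2's feature has not reached node 0 after one layer
```

This locality is exactly why higher-order connectivity must be injected either by stacking layers or by models that look beyond first-order neighbours.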