Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2016
DOI: 10.18653/v1/n16-1065

Combining Recurrent and Convolutional Neural Networks for Relation Classification

Abstract: This paper investigates two different neural architectures for the task of relation classification: convolutional neural networks and recurrent neural networks. For both models, we demonstrate the effect of different architectural choices. We present a new context representation for convolutional neural networks for relation classification (extended middle context). Furthermore, we propose connectionist bi-directional recurrent neural networks and introduce ranking loss for their optimization. Finally, we show…
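The abstract names two concrete techniques: the extended middle context, in which the span between the two entities is presented to the network twice, and a ranking loss in the style of dos Santos et al. (2015). Below is a minimal, non-authoritative sketch of both, assuming PyTorch; the function names, index conventions, and the margin/scale values m_pos, m_neg, and gamma are illustrative placeholders, not the paper's settings.

```python
import torch

def extended_middle_context(tokens, e1, e2):
    """Split a tokenized sentence into two overlapping views so the middle
    context (between the entities) is seen twice:
    [left + e1 + middle] and [middle + e2 + right].
    The exact slicing convention is an assumption, not taken from the paper."""
    assert e1 < e2
    return tokens[:e2], tokens[e1 + 1:]

def ranking_loss(scores, gold, m_pos=2.5, m_neg=0.5, gamma=2.0):
    """Pairwise ranking loss in the style of dos Santos et al. (2015): push the
    gold-class score above a positive margin and the best competing class score
    below a negative margin. Margin and gamma values here are placeholders."""
    idx = torch.arange(scores.size(0))
    s_pos = scores[idx, gold]                 # score of the gold class
    competitors = scores.clone()
    competitors[idx, gold] = float('-inf')    # mask out the gold class
    s_neg = competitors.max(dim=1).values     # best competing class score
    loss = torch.log1p(torch.exp(gamma * (m_pos - s_pos))) \
         + torch.log1p(torch.exp(gamma * (m_neg + s_neg)))
    return loss.mean()

# Example: a batch of 4 sentences and 19 relation classes
# (19 is the class count of SemEval-2010 Task 8, used here only as an example).
scores = torch.randn(4, 19)
gold = torch.randint(0, 19, (4,))
print(ranking_loss(scores, gold))
```

In dos Santos et al. (2015), a loss of this form also allows the artificial "Other" class to be left unscored; whether this paper adopts that detail is not stated in the truncated abstract above.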

Cited by 139 publications (94 citation statements). References 5 publications.

“…Many deep learning models have been proposed for relation extraction, with a focus on end-to-end training using CNNs (Zeng et al., 2014; Nguyen and Grishman, 2015) and RNNs (Zhang et al., 2015). Other popular approaches include using CNNs or RNNs over dependency paths between entities (Xu et al., 2015a,b), augmenting RNNs with different components (Zhou et al., 2016), and combining RNNs and CNNs (Vu et al., 2016; Wang et al., 2016). compares the performance of CNN models against traditional approaches on slot filling using a portion of the TAC KBP evaluation data.…”
Section: Related Work
confidence: 99%
“…Neural network classifiers have recently become popular for relation extraction. Many of them focus on fully supervised settings, where recurrent neural networks (RNNs) and convolutional neural networks (CNNs) (Vu et al., 2016; Zeng et al., 2015; Xu et al., 2015a; Xu et al., 2015b; Zhang and Wang, 2015) as well as sequence models and tree models (Li et al., 2015; dos Santos et al., 2015) have been investigated. One network structure similar to our model is proposed in (Miwa and Bansal, 2016).…”
Section: Related Work
confidence: 99%
“…Patterns in larger text fragments can be encoded and exploited by recurrent neural networks (RNNs) or convolutional neural networks (CNNs), which have been used successfully for various sentence-level classification tasks, e.g., sentiment classification (Kim, 2014) or relation classification (Vu et al., 2016; Tai et al., 2015).…”
Section: Introduction
confidence: 99%