Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d17-1182

End-to-End Neural Relation Extraction with Global Optimization

Abstract: Neural networks have shown promising results for relation extraction. State-of-the-art models cast the task as an end-to-end problem, solved incrementally using a local classifier. Yet previous work using statistical models has demonstrated that global optimization can achieve better performance than local classification. We build a globally optimized neural model for end-to-end relation extraction, proposing novel LSTM features in order to better learn context representations. In addition, we present a…



Cited by 153 publications (133 citation statements). References 42 publications.
“…Many state-of-the-art RE models rely upon domain-specific external syntactic tools to construct dependency paths between the entities in a sentence (Li and Ji, 2014; Xu et al., 2015; Miwa and Bansal, 2016; Zhang et al., 2017). These systems suffer from cascading errors from these tools and are hard to generalize to different domains.…”
Section: Related Work (citation type: mentioning; confidence: 99%)
“…Bekoulis et al. (2018) use adversarial training as regularization for a neural model. Zhang et al. (2017) cast joint entity and relation extraction as a table filling problem and build a globally optimized neural model incorporating syntactic representations from a dependency parser. Similar to DYGIE, Sanh et al. (2019) and Luan et al. (2018a) use a multi-task learning framework for extracting entity, relation and coreference labels.…”
Section: Baselines (citation type: mentioning; confidence: 99%)
“…In later studies, Li and Ji (2014) extract entity mentions and relations using a structured perceptron with efficient beam search, which is significantly more efficient and less time-consuming than constraint-based approaches. Miwa and Sasaki (2014), Gupta et al. (2016), and Zhang et al. (2017) proposed the table-filling approach, which provides an opportunity to incorporate more sophisticated features and algorithms into the model, such as search orders in decoding and global features. Neural network models have been widely used in the literature as well.…”
Section: Extracting Entities and Relations (citation type: mentioning; confidence: 99%)
“…Miwa and Bansal (2016) leverage LSTM architectures to jointly predict both entities and relations, but fall short of ensuring prediction consistency. Zhang et al. (2017) combine the benefits of both neural networks and global optimization with beam search.…”
Section: Related Work (citation type: mentioning; confidence: 99%)