2019
DOI: 10.1007/978-3-030-15712-8_47
End-to-End Neural Relation Extraction Using Deep Biaffine Attention

Abstract: We propose a neural network model for joint extraction of named entities and relations between them, without any hand-crafted features. The key contribution of our model is to extend a BiLSTM-CRF-based entity recognition model with a deep biaffine attention layer to model second-order interactions between latent features for relation classification, specifically attending to the role of an entity in a directional relationship. On the benchmark "relation and entity recognition" dataset CoNLL04, experimental res…
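The biaffine attention layer described in the abstract scores an ordered pair of entity representations with a bilinear term (capturing second-order feature interactions) plus a linear term over their concatenation. A minimal dependency-free sketch of that scoring function is below; the paper's model learns these weights jointly with a BiLSTM encoder, and all names and dimensions here are illustrative assumptions, not the authors' implementation.

```python
def biaffine_score(head, dep, U, W, b):
    """Biaffine score s = head^T U dep + W [head; dep] + b.

    head, dep: feature vectors (lists of floats) for the two entities;
               because U need not be symmetric, swapping head and dep
               can change the score, which models relation direction.
    U: bilinear weight matrix, shape len(head) x len(dep).
    W: linear weights over the concatenation [head; dep].
    b: scalar bias.
    """
    # Second-order (pairwise) interaction term: head^T U dep.
    bilinear = sum(
        head[i] * U[i][j] * dep[j]
        for i in range(len(head))
        for j in range(len(dep))
    )
    # First-order term over the concatenated features.
    linear = sum(w, for_ in ()) if False else sum(
        w * x for w, x in zip(W, head + dep)
    )
    return bilinear + linear + b


# Toy example with 2-dimensional entity representations.
head = [1.0, 2.0]
dep = [0.5, -1.0]
U = [[1.0, 0.0], [0.0, 1.0]]   # identity bilinear weights
W = [0.1, 0.1, 0.1, 0.1]
b = 0.0
score = biaffine_score(head, dep, U, W, b)
# bilinear = 0.5 - 2.0 = -1.5; linear = 0.1 * 2.5 = 0.25; score = -1.25
```

In the full model, one such scorer is typically instantiated per relation label, and the per-pair scores are normalized (e.g. with a softmax) to classify the relation between each directed entity pair.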

Cited by 48 publications (24 citation statements). References 25 publications (43 reference statements).
“…Other entities in the example document are omitted for clarity. at once (Bekoulis et al., 2018; Nguyen and Verspoor, 2019). This not only improves simplicity and efficiency, but is also commonly motivated by the fact that the tasks can benefit from each other: for example, knowledge of two entities' types (such as person+organization) can boost certain relations between them (such as ceo_of).…”
Section: Introductionmentioning
confidence: 99%
“…Based on the architecture of [10], Refs. [42,43] incorporated different types of attention, but still relied on a sequence-labeling scheme. Ref.…”
Section: Joint Entity and Relation Extraction Modelmentioning
confidence: 99%
“…Early approaches commonly used conditional random fields (CRFs) on parse trees, representing the grammatical structure and dependencies in a sentence. Nguyen et al. [48] combine modern neural BiLSTM architectures with CRFs in an end-to-end trained model to improve performance. Based on the assumption that two entities mentioned in the same text segment are likely related, Soares et al. [73] use BERT [16] to learn relationship embeddings.…”
Section: Relationship Extraction (Relex)mentioning
confidence: 99%