2018
DOI: 10.1093/database/bay102

Extracting chemical–protein relations using attention-based neural networks

Abstract: Relation extraction is an important task in the field of natural language processing. In this paper, we describe our approach for the BioCreative VI Task 5: text mining chemical–protein interactions. We investigate multiple deep neural network (DNN) models, including convolutional neural networks, recurrent neural networks (RNNs) and attention-based (ATT-) RNNs (ATT-RNNs) to extract chemical–protein relations. Our experimental results indicate that ATT-RNN models outperform the same models without using attention…
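The abstract describes attention-based RNN (ATT-RNN) models for chemical–protein relation extraction. As a rough illustration of that idea, and not the paper's exact architecture, the following PyTorch sketch encodes a sentence with a bidirectional GRU and weights the hidden states with a learned attention layer before classification; the layer sizes, class count and attention form are all assumptions.

```python
# Hypothetical sketch of an attention-based RNN (ATT-RNN) relation classifier.
# Dimensions and the attention scheme are illustrative, not the paper's setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttRNNRelationClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=200, hidden_dim=128, num_classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Bidirectional GRU encoder over the token sequence.
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # One attention score per time step, computed from the hidden states.
        self.att_score = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        h, _ = self.rnn(self.embed(token_ids))         # (batch, seq_len, 2*hidden)
        alpha = F.softmax(self.att_score(h), dim=1)    # attention weights over tokens
        sent = (alpha * h).sum(dim=1)                  # weighted sentence vector
        return self.classifier(sent)                   # relation-class logits
```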


Cited by 33 publications (24 citation statements, published between 2019 and 2024).
References 42 publications (36 reference statements).
“…In the pooling layer, the attention pooling (22) is applied to map the multihead attention matrix to the sentence vector representation. The ‘softmax’ function is used in the output layer to implement the detection and classification of CPIs.…”
Section: Methods (mentioning)
Confidence: 99%
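The statement above describes an attention-pooling layer that maps a multi-head attention matrix to a single sentence vector, followed by a softmax output layer for CPI detection and classification. Below is a minimal sketch of that pooling-plus-output pattern; the dimensions, head count and the learned pooling query are assumptions, and the cited work's exact formulation may differ.

```python
# Sketch of multi-head self-attention followed by attention pooling and a
# softmax output layer for CPI classification. Sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionPoolingHead(nn.Module):
    def __init__(self, d_model=256, num_heads=4, num_classes=6):
        super().__init__()
        self.self_att = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.pool_query = nn.Parameter(torch.randn(d_model))  # learned pooling query
        self.out = nn.Linear(d_model, num_classes)

    def forward(self, token_states):                       # (batch, seq_len, d_model)
        att_out, _ = self.self_att(token_states, token_states, token_states)
        # Attention pooling: score each position against the learned query vector.
        scores = att_out @ self.pool_query                  # (batch, seq_len)
        weights = F.softmax(scores, dim=1).unsqueeze(-1)    # (batch, seq_len, 1)
        sent_vec = (weights * att_out).sum(dim=1)           # (batch, d_model)
        return F.log_softmax(self.out(sent_vec), dim=-1)    # class log-probabilities
```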
“…We introduce a deep neural model to extract CPIs from the literature, which includes an ELMo input layer, bidirectional long short-term memory networks (Bi-LSTMs) and a multihead attention layer. Liu et al (22) integrated attention pooling into the gated recurrent unit (GRU) model to extract CPIs. Verga et al (23) combined the multihead attention with CNNs to construct transformer model to extract the document-level biomedical relations.…”
Section: Introduction (mentioning)
Confidence: 99%
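The model outlined in this statement stacks an ELMo input layer, Bi-LSTMs and a multi-head attention layer. The sketch below assumes the ELMo embeddings are precomputed and passed in as a tensor; the hidden size and head count are illustrative assumptions.

```python
# Sketch of a Bi-LSTM encoder over precomputed contextual (ELMo-style)
# embeddings, followed by a multi-head self-attention layer.
import torch
import torch.nn as nn

class BiLSTMMultiheadEncoder(nn.Module):
    def __init__(self, elmo_dim=1024, hidden_dim=128, num_heads=4):
        super().__init__()
        self.bilstm = nn.LSTM(elmo_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.mh_att = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)

    def forward(self, elmo_embeddings):                 # (batch, seq_len, elmo_dim)
        h, _ = self.bilstm(elmo_embeddings)             # (batch, seq_len, 2*hidden_dim)
        att_out, _ = self.mh_att(h, h, h)               # self-attention over LSTM states
        return att_out                                  # token-level representations
```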
“…Efforts such as ChemProt (Krallinger et al., 2017) aim to promote research in this direction by presenting PubMed abstracts in which proteins, chemicals, and their relations are manually labeled. Liu et al., 2018 and Lim and Kang, 2018 developed sequential models for chemical and protein relation extraction on ChemProt, while Peng et al., 2018 used an ensemble of deep models with SVM. However, all these models depend on the named entities to be recognized beforehand.…”
Section: Related Work (mentioning)
Confidence: 99%
“…Deep learning has been increasingly popular as these methods can outperform common machine learning methods [49]. Approaches in this field consist of using various neural network architectures, such as recurrent neural networks [50,51,52,53,54,55] and convolutional neural networks [51,54,56,57,58], to extract relationships from text. In fact approaches in this field were the winning model within the BioCreative VI shared task [41,59].…”
Section: Supervised Extractors (mentioning)
Confidence: 99%
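As a concrete instance of the CNN-based extractors this statement mentions, the sketch below applies 1-D convolutions of several widths over token embeddings with max-over-time pooling, a common pattern for sentence-level relation classification; it is a generic illustration rather than any specific cited architecture, and all sizes are assumptions.

```python
# Generic CNN relation extractor: parallel 1-D convolutions over token
# embeddings, max-over-time pooling, and a linear output layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNRelationExtractor(nn.Module):
    def __init__(self, vocab_size, emb_dim=200, num_filters=100,
                 kernel_sizes=(3, 4, 5), num_classes=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k) for k in kernel_sizes
        )
        self.out = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                        # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)        # (batch, emb_dim, seq_len)
        # Max-over-time pooling for each kernel width, then concatenate.
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.out(torch.cat(pooled, dim=1))        # relation-class logits
```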