2017
DOI: 10.1186/s12859-017-1868-5

Long short-term memory RNN for biomedical named entity recognition

Abstract: Background: Biomedical named entity recognition (BNER) is a crucial initial step of information extraction in the biomedical domain. The task is typically modeled as a sequence labeling problem. Various machine learning algorithms, such as Conditional Random Fields (CRFs), have been successfully used for this task. However, these state-of-the-art BNER systems largely depend on hand-crafted features. Results: We present a recurrent neural network (RNN) framework based on word embeddings and character representation. On t…
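As a rough illustration of the kind of model the abstract describes, a word-level BiLSTM tagger can be fed with word embeddings concatenated to a character-level LSTM representation of each word. The sketch below is not the authors' implementation; vocabulary sizes, dimensions, and the tag set are illustrative assumptions (PyTorch):

```python
# Minimal sketch of a BiLSTM sequence tagger combining word embeddings with a
# character-level LSTM word representation. All sizes are illustrative.
import torch
import torch.nn as nn

class CharWordBiLSTMTagger(nn.Module):
    def __init__(self, word_vocab=10000, char_vocab=80, num_tags=5,
                 word_dim=100, char_dim=25, char_hidden=25, hidden=100):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # character-level BiLSTM builds a word representation from its characters
        self.char_lstm = nn.LSTM(char_dim, char_hidden, batch_first=True,
                                 bidirectional=True)
        # word-level BiLSTM over [word embedding ; character representation]
        self.word_lstm = nn.LSTM(word_dim + 2 * char_hidden, hidden,
                                 batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, word_len)
        b, s, w = char_ids.shape
        chars = self.char_emb(char_ids).view(b * s, w, -1)
        _, (h, _) = self.char_lstm(chars)               # final fwd/bwd states
        char_repr = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        words = torch.cat([self.word_emb(word_ids), char_repr], dim=-1)
        states, _ = self.word_lstm(words)
        return self.out(states)                         # per-token tag scores

# Toy usage: one sentence of 6 tokens, each padded to 10 characters.
tagger = CharWordBiLSTMTagger()
scores = tagger(torch.randint(1, 10000, (1, 6)),
                torch.randint(1, 80, (1, 6, 10)))
print(scores.shape)  # torch.Size([1, 6, 5])
```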

Cited by 110 publications (70 citation statements) · References 27 publications (24 reference statements)
“…Lyu et al. introduce a variation of the recurrent neural network (RNN) called bidirectional long short-term memory (BLSTM) to recognize biomedical named entities. 20 Lakhani et al. use a convolutional neural network (CNN) to extract ICD-O-3 topographic codes from pathology reports. 7 Gao et al. use hierarchical attention networks for information extraction from pathology reports.…”
Section: Introduction
confidence: 99%
“…Korvigo et al. [26] applied a CNN-RNN network to recognize spans of chemicals, and Luo et al., 2018 [28] proposed an attention-based bidirectional LSTM with CRF to detect spans of chemicals. Unanue et al., 2017 [29] used a bidirectional LSTM with CRF to detect spans of drug names and clinical concepts, while Lyu et al., 2017 [27] proposed a bidirectional LSTM-RNN model for detecting spans of a variety of biomedical concepts. However, none of these approaches also attempted the normalization step, so they did not identify which particular concept in an ontology was detected.…”
Section: Related Work
confidence: 99%
“…Features that capture word-internal characteristics have been shown to be effective for BNER tasks in CRF models (Klinger et al., 2008). Lyu et al. (2017) applied a BiLSTM-CRF model with LSTM-based character-level word embeddings to a gene and protein NER task, demonstrating state-of-the-art performance that outperformed traditional feature-based models. Luo et al. (2018) further improved on this result on a chemical NER task by adding an attention layer between the BiLSTM and CRF layers (Att-BiLSTM-CRF).…”
Section: Introduction
confidence: 99%
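The BiLSTM-CRF and Att-BiLSTM-CRF models cited above let the BiLSTM produce per-token emission scores and a linear-chain CRF choose the globally best tag sequence. A minimal single-sentence sketch of such a CRF layer is below; it is not taken from any of the cited systems, and batching, masking, start/stop transitions, and the attention layer are omitted, with all sizes illustrative:

```python
# Hedged sketch of a linear-chain CRF on top of BiLSTM emission scores.
import torch
import torch.nn as nn

class LinearChainCRF(nn.Module):
    def __init__(self, num_tags):
        super().__init__()
        # transitions[i, j] = score of moving from tag i to tag j
        self.transitions = nn.Parameter(torch.randn(num_tags, num_tags) * 0.01)

    def neg_log_likelihood(self, emissions, tags):
        # emissions: (seq_len, num_tags); tags: (seq_len,)
        return self._log_partition(emissions) - self._score(emissions, tags)

    def _score(self, emissions, tags):
        # Score of one gold tag sequence: emissions plus transition scores.
        score = emissions[0, tags[0]]
        for t in range(1, emissions.size(0)):
            score = score + self.transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
        return score

    def _log_partition(self, emissions):
        # Forward algorithm: log-sum over all possible tag sequences.
        alpha = emissions[0]
        for t in range(1, emissions.size(0)):
            alpha = torch.logsumexp(alpha.unsqueeze(1) + self.transitions, dim=0) + emissions[t]
        return torch.logsumexp(alpha, dim=0)

    def decode(self, emissions):
        # Viterbi decoding: best tag sequence for one sentence.
        score, back = emissions[0], []
        for t in range(1, emissions.size(0)):
            total = score.unsqueeze(1) + self.transitions   # (from_tag, to_tag)
            best, idx = total.max(dim=0)
            back.append(idx)
            score = best + emissions[t]
        path = [int(score.argmax())]
        for idx in reversed(back):
            path.append(int(idx[path[-1]]))
        return list(reversed(path))

# Toy usage with 6 tokens and 5 tags (e.g. BIO labels); in a full tagger the
# emissions would come from the BiLSTM's per-token output scores.
crf = LinearChainCRF(num_tags=5)
emissions = torch.randn(6, 5)
print(crf.decode(emissions))
print(crf.neg_log_likelihood(emissions, torch.tensor([0, 1, 2, 0, 3, 4])))
```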