2018
DOI: 10.1109/access.2018.2882443
An Attentive Neural Sequence Labeling Model for Adverse Drug Reactions Mentions Extraction

Cited by 21 publications (21 citation statements) · References 27 publications
“…We conduct experiments to examine the effectiveness of CNN-NEAT’s Attention in locating the text segments that contain the correct side effects. We benchmark our results against lexicon-based tagging using the UMLS thesaurus of medical terms [52] and a state-of-the-art neural side-effect extractor [21] trained in a supervised manner to identify side-effect mentions in social media content. In this task, correctly predicting the positive side effects is of the utmost importance; hence, we evaluate the text segments extracted by CNN-NEAT’s Attention against the two baselines above on the precision metric.…”
Section: Results (mentioning, confidence: 99%)
“…By learning to focus on essential text segments, attention allows text encoders to capture long-term semantic dependencies with regard to auxiliary contextual information [41, 42]. In our related task of ADR mention extraction, attention has recently been adopted in neural sequence-labelling models [21, 43], yielding promising improvements. Inspired by this concept, we enhance text encoding with user-expertise attention.…”
Section: Methods (mentioning, confidence: 99%)
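The attention mechanism this excerpt refers to can be illustrated with a minimal dot-product attention sketch. This is purely illustrative pure Python, not the cited models' implementation — in [21, 43] the scores and vectors are learned jointly with a neural encoder, and the function names here are hypothetical:

```python
import math

def attention_weights(scores):
    """Softmax over raw attention scores -> weights that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(token_vecs, query):
    """Dot-product attention: score each token vector against a query
    vector, softmax the scores, and return (weights, weighted sum)."""
    scores = [sum(t * q for t, q in zip(vec, query)) for vec in token_vecs]
    weights = attention_weights(scores)
    dim = len(query)
    context = [sum(w * vec[i] for w, vec in zip(weights, token_vecs))
               for i in range(dim)]
    return weights, context
```

Tokens whose vectors align with the query receive larger weights, which is how attention "focuses" the encoder on essential segments; inspecting the weights is also what makes such models partially interpretable.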
“…Zheng [37] proposed the OpenTag model, which uses a BiLSTM to capture context information and a CRF to perform the sequence labeling, together with a new attention mechanism that provides an interpretation of the model’s decisions. Similarly, sequence labeling tasks are widely used in the medical field [6], [11], [29], [32].…”
Section: Related Work (mentioning, confidence: 99%)
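The CRF layer mentioned for OpenTag is decoded at inference time with Viterbi search over per-token emission scores and tag-transition scores. A minimal sketch of that standard decoding step follows (illustrative only, not OpenTag's actual implementation; in a BiLSTM-CRF the emission scores come from the BiLSTM and the transition scores are learned parameters):

```python
def viterbi(emissions, transitions, tags):
    """Viterbi decoding for a linear-chain CRF layer.
    emissions[t][tag]   : score of `tag` at token position t
    transitions[a][b]   : score of moving from tag a to tag b
    Returns the highest-scoring tag sequence."""
    best = {tag: emissions[0][tag] for tag in tags}
    backpointers = []
    for t in range(1, len(emissions)):
        new_best, ptr = {}, {}
        for cur in tags:
            # best previous tag to transition from, for each current tag
            prev = max(tags, key=lambda p: best[p] + transitions[p][cur])
            new_best[cur] = best[prev] + transitions[prev][cur] + emissions[t][cur]
            ptr[cur] = prev
        best = new_best
        backpointers.append(ptr)
    # trace back from the best final tag
    last = max(tags, key=lambda tag: best[tag])
    path = [last]
    for ptr in reversed(backpointers):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

Unlike greedy per-token tagging, this jointly scores the whole sequence, letting the transition matrix rule out invalid label patterns (e.g. an I- tag following O in BIO tagging).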
“…TF-IDF) and syntactic features. Recently, deep learning has been widely used in natural language processing (NLP) [12]–[15], bringing hope of reducing manual feature engineering across tasks. Compared with hand-designed features and traditional discrete feature representations, it provides a way to automatically learn dense feature representations for text, such as words, phrases, and sentences.…”
Section: Introduction (mentioning, confidence: 99%)