2018
DOI: 10.1016/j.jbi.2018.07.025

A convolutional route to abbreviation disambiguation in clinical text

Abstract: Neural network models disambiguate abbreviations in clinical narratives well and are robust across datasets, which avoids feature engineering for each dataset. Coupled with enhanced automatic training-data generation, neural networks can simplify the development of a practical abbreviation disambiguation system.

Cited by 34 publications (39 citation statements)
References 11 publications
“…To address some of the findings from the error analysis, we plan to leverage our clinical abbreviation expansion components [ 86 ] to help resolve ambiguous mentions and also incorporate assertion recognition [ 26 ] to capture the belief state of the physician on a concept (negated, hypothetical, conditional).…”
Section: Discussionmentioning
confidence: 99%
“…As a matter of fact, a challenging aspect of NLP in medicine is the disambiguation of abbreviations. Joopudi et al (40) trained a Convolutional Neural Network (CNN) to disambiguate abbreviation senses. For example, mg could have two senses: myasthenia gravis and milligrams.…”
Section: Nlp Applications In Clinical Contextmentioning
confidence: 99%
“…In contrast to the techniques employed in the abovementioned studies, deep learning methods [25,26] have the obvious advantage that feature engineering can be avoided. In [26], Ahmed et al suggested a deep learning model to train a vector for the context of each sense of each abbreviation.…”
Section: Asdmentioning
confidence: 99%
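The sense-embedding approach described in these statements can be sketched in a few lines. This is a minimal illustration, not the cited implementation: the toy vocabulary, the 50-dimensional embeddings, and the per-sense vectors (which stand in for vectors learned from training data) are all hypothetical.

```python
import numpy as np

def context_vector(tokens, embeddings):
    """Average the embeddings of the words surrounding the abbreviation."""
    return np.mean([embeddings[t] for t in tokens], axis=0)

def disambiguate(tokens, sense_vectors, embeddings):
    """Return the sense whose trained context vector is most
    cosine-similar to the context vector of the current sentence."""
    ctx = context_vector(tokens, embeddings)
    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return max(sense_vectors, key=lambda s: cosine(ctx, sense_vectors[s]))

# Hypothetical 50-d word embeddings for a toy vocabulary
rng = np.random.default_rng(0)
vocab = ["patient", "took", "500", "of", "aspirin", "diagnosed", "with"]
embeddings = {w: rng.normal(size=50) for w in vocab}

# Hypothetical per-sense context vectors for the abbreviation "mg",
# standing in for vectors learned from training data
sense_vectors = {
    "milligrams": context_vector(["took", "500", "of", "aspirin"], embeddings),
    "myasthenia gravis": context_vector(["diagnosed", "with"], embeddings),
}

print(disambiguate(["patient", "took", "500", "of", "aspirin"],
                   sense_vectors, embeddings))
```

At inference time the sense whose stored context vector has the highest cosine similarity to the sentence's context vector is selected, matching the maximum-similarity rule described in the citation statements.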
“…For disambiguation, the cosine-similarities for the context vector of the sentence with the context vector of each sense are calculated, and the sense with the maximum value is selected. In [25], Joopudi et al proposed a convolutional neural network with one convolutional kernel, a max-pooling layer, and a fully connected feed-forward neural network layer followed by a fully connected softmax classifier. Except for the embedding of surrounding words, the location and part of speech information of the word are considered in the context.…”
Section: Asdmentioning
confidence: 99%
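The convolutional architecture described above (one convolution over the word-embedding sequence, max-pooling, a fully connected hidden layer, and a softmax classifier over senses) can be sketched as a plain NumPy forward pass. All dimensions and weights here are hypothetical, and the cited model's additional word-position and part-of-speech features are omitted for brevity.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def cnn_forward(x, conv_w, conv_b, fc_w, fc_b, out_w, out_b):
    """Forward pass of the described architecture: a 1-D convolution over
    the embedding sequence, max-pooling over time, one fully connected
    hidden layer, and a softmax classifier over abbreviation senses."""
    seq_len, _ = x.shape
    width, _, _ = conv_w.shape
    # 1-D convolution over time (valid padding) with ReLU activation
    conv = np.stack([
        np.maximum(0, np.tensordot(x[t:t + width], conv_w, axes=2) + conv_b)
        for t in range(seq_len - width + 1)])        # (positions, n_filters)
    pooled = conv.max(axis=0)                        # max-pooling over time
    hidden = np.maximum(0, fc_w @ pooled + fc_b)     # fully connected layer
    return softmax(out_w @ hidden + out_b)           # sense probabilities

# Hypothetical shapes: 9 context words, 8-d embeddings, filter width 3,
# 16 feature maps, a 12-unit hidden layer, and two senses (e.g. for "mg")
rng = np.random.default_rng(1)
x = rng.normal(size=(9, 8))
conv_w, conv_b = rng.normal(size=(3, 8, 16)), np.zeros(16)
fc_w, fc_b = rng.normal(size=(12, 16)), np.zeros(12)
out_w, out_b = rng.normal(size=(2, 12)), np.zeros(2)
probs = cnn_forward(x, conv_w, conv_b, fc_w, fc_b, out_w, out_b)
print(probs)
```

A real system would learn these weights by backpropagation over labeled (or automatically generated) training examples; this sketch only shows the shape of the forward computation.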