2019
DOI: 10.1093/jamia/ocz200

Deep learning in clinical natural language processing: a methodical review

Abstract: Objective This article methodically reviews the literature on deep learning (DL) for natural language processing (NLP) in the clinical domain, providing quantitative analysis to answer 3 research questions concerning methods, scope, and context of current research. Materials and Methods We searched MEDLINE, EMBASE, Scopus, the Association for Computing Machinery Digital Library, and the Association for Computational Linguisti…

Cited by 313 publications (206 citation statements)
References 44 publications
“…This investigation is part of a wider literature that employs deep learning in clinical NLP [31]. In this study, we employ a specific RNN structure that had been previously and successfully used in combination with GloVe embeddings [16].…”
Section: PLOS ONE
confidence: 99%
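The excerpt above describes feeding pretrained GloVe embeddings into an RNN. As a minimal sketch of that pattern, the toy example below runs an Elman-style recurrence over a two-token sequence; the embedding table and weight matrices are invented for illustration (real GloVe vectors are 50-300 dimensional and RNN weights are learned, not hand-set).

```python
import math

# Stand-in for a pretrained GloVe lookup table (values invented).
EMBED = {
    "chest": [0.2, 0.7],
    "pain": [0.9, 0.1],
}

W_xh = [[0.5, -0.3], [0.1, 0.8]]   # input-to-hidden weights (illustrative)
W_hh = [[0.4, 0.0], [0.0, 0.4]]    # hidden-to-hidden weights (illustrative)

def rnn_step(x, h):
    # Elman recurrence: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1})
    return [
        math.tanh(sum(wx * xi for wx, xi in zip(W_xh[i], x)) +
                  sum(wh * hi for wh, hi in zip(W_hh[i], h)))
        for i in range(2)
    ]

h = [0.0, 0.0]
for token in ["chest", "pain"]:
    h = rnn_step(EMBED[token], h)  # hidden state carries context forward

print(h)  # final hidden state summarizing the sequence
```

The key property is that the final hidden state depends on every token seen so far, which is what makes the architecture useful for sequence labeling and classification over clinical text.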
“…Most of the criticisms of EHRs in recent years have focused on their role in physician burnout (146,147). The technology perspective on EHR data, fueled by methodological advances like deep learning (23) and the near-continuous development of high-performing predictive and diagnostic algorithms (6,Table 3), has primarily been one of excitement.…”
Section: Results
confidence: 99%
“…Until the last few years, embeddings consisted of one vector per entity; that is, one vector per word, phrase, or document. However, novel neural network architectures (23) have permitted the creation of embeddings that vary depending on the context; this has expanded the representational power of embedding methods and led to the creation of massive pretrained language models like BERT (Bidirectional Encoder Representations from Transformers) (30). These models are generally too resource-intensive to be trained from scratch.…”
Section: Contextual Embeddings and Pretraining
confidence: 99%
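The distinction the excerpt draws, one fixed vector per word versus a vector that varies with context, can be illustrated with a toy sketch. The neighbor-averaging "contextual encoder" below is purely illustrative and stands in for the deep transformer layers a model like BERT actually uses; all words and vectors are invented.

```python
STATIC = {  # static embedding: one fixed vector per word
    "patient": [1.0, 0.0],
    "denies": [0.0, 1.0],
    "reports": [0.5, 0.5],
    "pain": [1.0, 1.0],
}

def static_embed(sentence, word):
    # Static lookup: the surrounding sentence is ignored entirely.
    return STATIC[word]

def contextual_embed(sentence, word):
    # Crude stand-in for a contextual encoder: mix the word's vector
    # with the average of its neighbors' vectors, so the result
    # depends on the sentence the word appears in.
    tokens = sentence.split()
    i = tokens.index(word)
    neighbors = [STATIC[t] for j, t in enumerate(tokens) if j != i]
    avg = [sum(dim) / len(neighbors) for dim in zip(*neighbors)]
    return [(b + a) / 2 for b, a in zip(STATIC[word], avg)]

s1 = "patient denies pain"
s2 = "patient reports pain"

# Static: identical vector for "pain" regardless of context.
assert static_embed(s1, "pain") == static_embed(s2, "pain")
# Contextual: the vector for "pain" shifts with its context.
assert contextual_embed(s1, "pain") != contextual_embed(s2, "pain")
```

In clinical NLP this context sensitivity matters: "denies pain" and "reports pain" carry opposite clinical meanings, and only a contextual representation can distinguish the two occurrences of "pain".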
“…Some research communities began to launch competitions on machine reading comprehension in the clinical field, such as MEDIQA 2019 [4], BIOASQ [5], etc. These tasks have attracted researchers to carry out a variety of studies and have played an important role in promoting research in the clinical medical field [6]. Related datasets have also been proposed, such as CliCR [7], PubMedQA [8], Chimed [2], and emrQA [3], etc.…”
Section: Introduction
confidence: 99%