2019
DOI: 10.2196/14830
Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)–Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study

Abstract: Background The bidirectional encoder representations from transformers (BERT) model has achieved great success in many natural language processing (NLP) tasks, such as named entity recognition and question answering. However, little prior work has explored this model to be used for an important task in the biomedical and clinical domains, namely entity normalization. Objective We aim to investigate the effectiveness of BERT-based models for biomedical o…
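The abstract frames entity normalization as mapping a free-text clinical mention to a canonical concept in a standard vocabulary. The sketch below illustrates that task setup only, not the paper's BERT-based model: it is a toy dictionary baseline that ranks candidate concepts by character-trigram overlap. The concept IDs and vocabulary entries are invented for illustration and are not taken from the paper.

```python
# Toy illustration of the entity-normalization task that BERT-based models
# such as EhrBERT address: map a clinical mention string to a concept ID.
# This is a character-trigram Jaccard baseline, NOT the paper's method;
# the three vocabulary entries below are invented examples.

def trigrams(s: str) -> set:
    """Return the set of character trigrams of a padded, lowercased string."""
    s = f"  {s.lower()}  "
    return {s[i:i + 3] for i in range(len(s) - 2)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two trigram sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# Hypothetical concept-ID -> preferred-term vocabulary (illustrative only).
VOCAB = {
    "C0020538": "hypertensive disease",
    "C0011849": "diabetes mellitus",
    "C0004096": "asthma",
}

def normalize(mention: str) -> str:
    """Return the vocabulary concept ID whose term best matches the mention."""
    m = trigrams(mention)
    return max(VOCAB, key=lambda cid: jaccard(m, trigrams(VOCAB[cid])))

print(normalize("hypertension"))  # highest trigram overlap: C0020538
```

A BERT-based normalizer replaces the hand-crafted similarity with a learned scoring or classification over concept candidates, which is what makes it robust to abbreviations and paraphrases that defeat surface-string matching.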


Cited by 135 publications (95 citation statements)
References 25 publications
“…Several NLP studies on electronic health records have attempted to create models that accomplish multiple tasks based on an advanced deep learning approach. Li et al developed a BERT-based model for electronic health records (EhrBERT) to normalize biomedical entities on the standard vocabulary 11 . In comparison, our model could extract pathological keywords stated in the original text and intuitively summarize narrative reports while preserving the intention of the vocabulary.…”
Section: Discussion
confidence: 99%
“…The bidirectional encoder representations from transformers (BERT) model is one of the latest deep learning language models based on attention mechanisms 10 . It has been applied in many kinds of biomedical natural language processing (NLP) research, including clinical entity normalization, text mining (i.e., BioBERT), breast cancer concept extraction, and discharge summary diagnosis extraction [11][12][13][14] .…”
confidence: 99%
“…BERT set new performance records across nearly all natural language processing benchmarks and, for the first time, achieved human‐level performance on several of these. BERT and related technologies are being applied to unstructured biomedical 215,216 and clinical texts 217,218 . Translation of these new technologies into the clinical domain is still in the early stages, but progress is accelerating.…”
Section: Data Mining, Natural Language Processing, and AI Using Big Data
confidence: 99%
“…They produced two models, one for generic clinical text and another for discharge summaries, which they released publicly (35). They and others have demonstrated that BERT models fine-tuned on clinical corpora improve the state of the art on clinical concept recognition, de-identification, inference, and concept normalization tasks (88,89), though in at least one case, UMLS features still contributed valuable additional information (90).…”
Section: Contextual Embeddings and Pretraining
confidence: 99%