2021
DOI: 10.2196/27527
Relation Classification for Bleeding Events From Electronic Health Records Using Deep Learning Systems: An Empirical Study

Abstract: Background Accurate detection of bleeding events from electronic health records (EHRs) is crucial for identifying and characterizing different common and serious medical problems. To extract such information from EHRs, it is essential to identify the relations between bleeding events and related clinical entities (eg, bleeding anatomic sites and lab tests). With the advent of natural language processing (NLP) and deep learning (DL)-based techniques, many studies have focused on their applicability …

Cited by 14 publications (13 citation statements)
References 49 publications
“…Recently, multiple deep learning approaches to NLP have used Bidirectional Encoder Representations from Transformers (BERT) [39, 40] with self-supervised pre-training on PubMed and PubMed Central and fine-tuned them in a fully supervised manner to achieve state-of-the-art performance on several biomedical named entity recognition (NER) [41, 42] and entity normalization tasks [43]. In the related clinical domain, pre-training on clinical notes [44–46] and fine-tuning on electronic health records [47] have been demonstrated to identify semantically similar sentences for note summarization [48], classify relations between bleeding events and clinical entities for better detection of bleeding [49], perform clinical entity normalization [47], and predict diseases [50]. Based on the bidirectional transformer’s ability to transfer deep contextual learning and its recent success on a wide variety of NLP tasks, we hypothesize that a BioBERT-based model will be effective for EI extraction [41], particularly given that rare diseases have less training data available.…”
Section: Introduction (mentioning, confidence: 99%)
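A common input-formatting step for BERT-style relation classifiers of this kind is to wrap the two candidate entities (e.g., a bleeding event and an anatomic site or drug) in marker tokens before the sentence is fed to the model. The sketch below shows only that preprocessing step in plain Python; the `[E1]`/`[E2]` marker names and the helper function are illustrative assumptions, not taken from the cited papers:

```python
def mark_entities(text, e1_span, e2_span,
                  e1_tags=("[E1]", "[/E1]"), e2_tags=("[E2]", "[/E2]")):
    """Insert marker tokens around two non-overlapping entity spans
    (character offsets, end-exclusive) so a BERT-style classifier can
    attend to the candidate entity pair.

    Note: the marker scheme here is a common convention, assumed for
    illustration; real systems also register the markers as special
    tokens in the tokenizer vocabulary.
    """
    # Insert from the rightmost span first so earlier offsets stay valid.
    spans = sorted([(e1_span, e1_tags), (e2_span, e2_tags)],
                   key=lambda s: s[0][0], reverse=True)
    for (start, end), (open_t, close_t) in spans:
        text = text[:end] + close_t + text[end:]
        text = text[:start] + open_t + text[start:]
    return text

sentence = "Patient developed melena after starting warfarin."
# "melena" occupies chars 18-24, "warfarin" chars 40-48
marked = mark_entities(sentence, (18, 24), (40, 48))
# → "Patient developed [E1]melena[/E1] after starting [E2]warfarin[/E2]."
```

The marked sentence would then be tokenized and passed to a fine-tuned sequence-classification head that predicts the relation label for the pair.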
“…Mobile devices are also a good option for deploying deep learning and reinforcement learning models. Deep learning has been shown to improve a broad range of clinical applications, such as detection of drug discontinuation events, improved phenotyping, making diagnoses based on clinical images, and prediction of clinical outcomes [46, 67–74]. In reinforcement learning, the computer uses real-time trial and error in an interactive environment to maximize the total cumulative reward (e.g., maximize the probability of obtaining the intended outcome).…”
Section: Digital Non-medical Data (mentioning, confidence: 99%)
“…The current leading NLP models such as BERT [20], GPT [21], and T5 [22], all announced later, are based on this transformer block. In particular, BERT is commonly used in biomedical text mining research because it is built on multiple transformer encoder blocks, which have the advantage of compressing the sentence and mining semantic information from it [8, 23–25].…”
Section: Deep Learning-based Semantic Relation Classification Model (mentioning, confidence: 99%)
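The scaled dot-product attention at the heart of these transformer encoder blocks can be sketched in a few lines. This is a toy, single-head, unbatched illustration in pure Python; real BERT encoders add learned query/key/value projections, multiple heads, layer normalization, and feed-forward sublayers:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: each query attends over all keys,
    and the output is the attention-weighted average of the values."""
    d = len(keys[0])  # key dimensionality, used for the 1/sqrt(d) scaling
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# With self-attention (queries = keys = values), each token's output
# mixes in information from every other token, weighted by similarity.
tokens = [[1.0, 0.0], [0.0, 1.0]]
mixed = attention(tokens, tokens, tokens)
```

Stacking such blocks is what lets BERT build the deep contextual representations that the citing papers fine-tune for relation classification.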
“…BERT is also being utilized in the medical and clinical fields to automatically analyze various medical data such as electronic health records [24, 28]. A need is also emerging for systematic reviews to support evidence-based diagnosis and treatment in the medical field.…”
Section: Introduction (mentioning, confidence: 99%)