2023
DOI: 10.1109/jbhi.2023.3288768

A Transformer-Based Model Trained on Large Scale Claims Data for Prediction of Severe COVID-19 Disease Progression

Abstract: In situations like the COVID-19 pandemic, healthcare systems are under enormous pressure as they can rapidly collapse under the burden of the crisis. Machine learning (ML) based risk models could lift the burden by identifying patients with a high risk of severe disease progression. Electronic Health Records (EHRs) provide crucial sources of information to develop these models because they rely on routinely collected healthcare data. However, EHR data is challenging for training ML models because it contains i…


Cited by 8 publications (2 citation statements)
References 50 publications
“…age, sex, education) or molecular data modalities commonly available in biobanks (e.g. genomics, proteomics) [37].…”
Section: Discussion (mentioning)
confidence: 99%
“…Specifically, transformers learn context by tracking relationships in sequential data such as words in a sentence. Typically, transformer-based models are trained in 2 phases: the pretraining phase focuses on generic representation learning, and the transfer learning phase focuses on adjusting the model to an application-specific prediction task [4]. The pretrained models, which are often trained on large data sets (eg, Wikipedia, Reddit, biomedical literature, or public medical data sets), are tuned to be used for a wider set of tasks and can be fine-tuned for specific tasks [5].…”
Section: Introduction (mentioning)
confidence: 99%
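
The two-phase workflow described in this citation statement (generic pretraining followed by task-specific fine-tuning) can be sketched in a few lines. The snippet below is a minimal illustration using the Hugging Face transformers library; the checkpoint name, toy inputs, and labels are assumptions chosen for illustration and are not the claims-data setup used in the paper itself.

# Minimal sketch of the two-phase transformer workflow: phase 1 loads a
# generically pretrained checkpoint, phase 2 fine-tunes it on a small
# application-specific classification task. Checkpoint and toy data are
# illustrative assumptions, not the paper's actual claims-data setup.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Phase 1: reuse a model pretrained on large generic corpora (e.g. Wikipedia).
checkpoint = "bert-base-uncased"  # illustrative checkpoint name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Phase 2: fine-tune on task-specific labeled examples (two toy sentences here).
texts = ["patient discharged in stable condition", "patient transferred to ICU"]
labels = torch.tensor([0, 1])  # 0 = mild course, 1 = severe progression (hypothetical)

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps stand in for a real training loop
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

In practice the pretraining phase (here replaced by downloading a public checkpoint) is the expensive step, while fine-tuning updates the same weights at a small learning rate on the downstream task, which is what allows one pretrained model to serve many prediction tasks.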