2023
DOI: 10.1016/j.compbiomed.2023.106619
Explainable artificial intelligence model for identifying COVID-19 gene biomarkers

Cited by 29 publications (12 citation statements) · References 43 publications
“…This study further highlights the potential role of variants with similar MAFs for use as biomarkers for COVID-19. Yagin et al (2023) recently developed an artificial intelligence model (extreme gradient boosting) based on gene expression profiles, capable of successfully predicting COVID-19 [176]. Similarly, SNPs should be studied for their potential use as predictive biomarkers for COVID-19 severity.…”
Section: Sars-cov-2 Receptor-based Therapy and Clinical Significance
confidence: 99%
“…In tenfold CV, the dataset is split into ten equal parts, with each part used once for evaluation and the rest for training, repeated ten times. This procedure, repeated 5 times with new random partitions, offers a robust assessment of the model's performance and reduces variability in performance estimates [ 23 , 25 ].…”
Section: Methods
confidence: 99%
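The repeated 10-fold cross-validation procedure described above can be sketched with the standard library alone; this is a minimal illustration of the splitting scheme, not the cited study's actual pipeline, and the fold sizes assume the sample count is divisible into roughly equal parts:

```python
import random

def repeated_kfold(n_samples, k=10, repeats=5, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated k-fold CV.

    Each repeat reshuffles the sample indices into k roughly equal
    folds; every fold then serves exactly once as the held-out
    test set while the remaining k-1 folds form the training set.
    """
    rng = random.Random(seed)
    indices = list(range(n_samples))
    for _ in range(repeats):
        rng.shuffle(indices)
        # Striding produces k folds whose sizes differ by at most one.
        folds = [indices[i::k] for i in range(k)]
        for i in range(k):
            test = folds[i]
            train = [idx for j, fold in enumerate(folds) if j != i
                     for idx in fold]
            yield train, test

splits = list(repeated_kfold(100, k=10, repeats=5))
print(len(splits))  # 50 train/test pairs: 5 repeats x 10 folds
```

Averaging a performance metric over all 50 splits gives the lower-variance estimate the passage refers to; library implementations such as scikit-learn's `RepeatedKFold` follow the same scheme.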
“…Shapley Additive Explanations (SHAP) is a novel game theory-based approach in explainable ML introduced by Lundberg and Lee [ 10 ]. It addresses the inexplicability of complex models by quantifying the contribution of each feature to the outcome, providing a way to better understand and interpret such models. Yagin et al proposed an explainable artificial intelligence model to predict COVID-19 using metagenomic next-generation sequencing (mNGS) data; the model allowed physicians to enhance their comprehension of the decision-making process in COVID-19 genomic prediction [ 11 ]. Another study developed an XGBoost model combined with SHAP to effectively predict 3-year all-cause mortality in coronary heart disease and heart failure patients.…”
Section: Introduction
confidence: 99%
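The game-theoretic idea behind SHAP can be illustrated with exact Shapley values computed over all player orderings. This is a toy stdlib sketch, not the SHAP library or the cited model: the three "gene" features and their effect sizes are invented for illustration, and the "model" is a simple additive value function with one interaction term.

```python
from itertools import permutations

def shapley_values(players, value_fn):
    """Exact Shapley values: average each player's marginal
    contribution over every possible ordering of the players."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = value_fn(frozenset(coalition))
            coalition.add(p)
            after = value_fn(frozenset(coalition))
            phi[p] += after - before
    return {p: total / len(perms) for p, total in phi.items()}

# Hypothetical toy "model": the prediction is the sum of per-feature
# effects, plus an interaction bonus when geneA and geneB co-occur.
effects = {"geneA": 2.0, "geneB": 1.0, "geneC": 0.0}

def model(coalition):
    value = sum(effects[f] for f in coalition)
    if {"geneA", "geneB"} <= coalition:
        value += 1.0  # interaction term
    return value

phi = shapley_values(list(effects), model)
print(phi)  # {'geneA': 2.5, 'geneB': 1.5, 'geneC': 0.0}
```

Note how the symmetric interaction term is split equally between geneA and geneB, and how the attributions sum exactly to the full model output (the efficiency property); the SHAP library applies the same principle with model-specific approximations, since enumerating all orderings is infeasible beyond a handful of features.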