2021
DOI: 10.21203/rs.3.rs-890026/v1
Preprint
Objective Evaluation of Deep Uncertainty Predictions for COVID-19 Detection

Abstract: Deep neural networks (DNNs) have been widely applied for detecting COVID-19 in medical images. Existing studies mainly apply transfer learning and other data representation strategies to generate accurate point estimates. The generalization power of these networks is always questionable due to being developed using small datasets and failing to report their predictive confidence. Quantifying uncertainties associated with DNN predictions is a prerequisite for their trusted deployment in medical settings. Here w…


Cited by 14 publications (10 citation statements)
References 24 publications
“…Bayes' Theorem is used to compute the posterior probability for the given classes. Equation 1 has been used to calculate the posterior probability [25]:…”
Section: Decision Maker Module (mentioning)
confidence: 99%
“…Where P(c|x) indicates the posterior probability of class c given the features x; P(c) represents the prior probability of the class; P(x|c) represents the likelihood of the features given that class; and P(x) represents the prior probability (evidence) of the features [25].…”
Section: Decision Maker Module (mentioning)
confidence: 99%
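Equation 1 itself is not reproduced in this excerpt; given the term definitions in the statement above, it is presumably the standard form of Bayes' rule:

```latex
P(c \mid x) = \frac{P(x \mid c)\, P(c)}{P(x)}
```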
“…Normally, model predictions are grouped into two categories: correct and incorrect. For uncertainty estimation, we have two further groups: certain and uncertain predictions [33]. We also need threshold values to categorize predictions as certain or uncertain.…”
Section: Uncertainty Evaluation Metrics (mentioning)
confidence: 99%
“…The confusion matrix is an effective tool for estimating the performance of a model in classification problems where accuracy alone can be a misleading indicator due to uneven class distributions. Performance metrics similar to regular confusion-matrix metrics are also defined for uncertainty estimation, as below [33]:…”
Section: Uncertainty Evaluation Metrics (mentioning)
confidence: 99%
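The two groupings described above (correct/incorrect crossed with certain/uncertain) yield a four-cell analogue of the confusion matrix. A minimal NumPy sketch, where the cell names (TC, TU, FC, FU) and the `uncertainty_confusion` helper are illustrative rather than taken from the cited work:

```python
import numpy as np

def uncertainty_confusion(y_true, y_pred, uncertainty, threshold):
    """Group predictions by correctness and by predictive uncertainty.

    A prediction counts as 'certain' when its uncertainty estimate falls
    below the chosen threshold; crossing this with correctness gives four
    cells analogous to TP/TN/FP/FN in a regular confusion matrix.
    """
    correct = np.asarray(y_true) == np.asarray(y_pred)
    certain = np.asarray(uncertainty) < threshold
    tc = int(np.sum(correct & certain))    # correct and certain
    tu = int(np.sum(~correct & ~certain))  # incorrect and uncertain
    fc = int(np.sum(~correct & certain))   # incorrect but certain (worst case)
    fu = int(np.sum(correct & ~certain))   # correct but flagged uncertain
    uacc = (tc + tu) / correct.size        # uncertainty accuracy
    return tc, tu, fc, fu, uacc
```

Sweeping the threshold trades off how many incorrect predictions are caught as uncertain against how many correct predictions are needlessly flagged.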
“…Reference [55] calls for uncertainty evaluation of predictions. This relates to the lack of statistical analysis of prediction quality noted in both references [3] and [4].…”
Section: The Analysis Of The Reviews Done To The Previous Covid-19 Di... (mentioning)
confidence: 99%