2019
DOI: 10.48550/arxiv.1911.00104
Preprint

Towards calibrated and scalable uncertainty representations for neural networks

Cited by 4 publications (3 citation statements)
References 7 publications
“…Deep neural networks are deterministic in nature, which is not always desirable, especially in the case of high-risk systems such as clinical systems [85,86,87]. It is desirable to have a system that can provide associated confidence with its prediction.…”
Section: Monte Carlo (MC) Dropout
Citation type: mentioning
Confidence: 99%
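The statement above points to Monte Carlo (MC) Dropout as a way to attach confidence to an otherwise deterministic network's predictions. Below is a minimal sketch of that idea, assuming a generic PyTorch classifier; the model architecture, layer sizes, and sample count are illustrative choices, not taken from the cited paper.

```python
# Minimal MC Dropout sketch (assumed PyTorch setup, not the cited paper's code).
import torch
import torch.nn as nn

class MCDropoutNet(nn.Module):
    def __init__(self, in_dim=20, hidden=64, n_classes=3, p=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    """Keep dropout active at test time and average the softmax outputs.

    Returns the predictive mean and per-class standard deviation; the
    standard deviation acts as a simple confidence signal per prediction.
    """
    model.train()  # leaves dropout layers stochastic during inference
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
    )
    return probs.mean(dim=0), probs.std(dim=0)

# Example usage on random inputs (shapes only; no trained weights assumed).
model = MCDropoutNet()
x = torch.randn(8, 20)
mean_probs, std_probs = mc_dropout_predict(model, x)
```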
“…Benchmarks are an important tool to help researchers prioritize the right approaches and to inform practitioners which methods are suited for their applications [5]. There is a growing demand for benchmarking in BDL, since methods must be scored both for task performance and uncertainty quality [6,7]. Rigorously evaluating the latter is considerably more difficult, since depending on the problem setting no direct uncertainty ground-truth exists, requiring a well-defined experimental setup [8].…”
Section: A. Related Work
Citation type: mentioning
Confidence: 99%
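The statement above notes that benchmarking Bayesian deep learning requires scoring uncertainty quality alongside task performance. As one concrete illustration, here is a minimal sketch of expected calibration error, a common uncertainty-quality metric; the function name, binning scheme, and toy numbers are illustrative assumptions, not the evaluation used in the cited works.

```python
# Minimal expected calibration error (ECE) sketch (assumed metric choice).
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """confidences: predicted max-probabilities; correct: 0/1 per prediction."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # Gap between empirical accuracy and mean confidence in this bin,
            # weighted by the fraction of predictions falling in the bin.
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap
    return ece

# Example: roughly calibrated toy predictions give a small ECE.
print(expected_calibration_error([0.9, 0.8, 0.7, 0.95], [1, 1, 0, 1]))
```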
“…Figure 11: Uncertainty Quantification techniques for deep learning models, grouping Variational Inference (Stochastic Variational Inference: initial adaptation [60], Bayes by Backprop [61], Reparameterization Trick [62], Normalizing Flows [63,64]; Monte Carlo Dropout: MC-Dropout [65], MC-DropConnect [54], Heteroscedastic classification NN [14]), Sampling Methods (Markov Chain Monte Carlo: Hamiltonian Monte Carlo [66], Stochastic Gradient MCMC (SG-MCMC) [67], RECAST, an SG-MCMC method [68]), Laplace Approximation, and Uncertainty correction (EpiCC) [59].…”
Section: Standard BNNs
Citation type: mentioning
Confidence: 99%
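Among the variational-inference entries listed in the figure above, Bayes by Backprop and the Reparameterization Trick share one core step: sampling weights differentiably from a learned Gaussian. The sketch below shows only that step; the variable names, shapes, and loss are illustrative assumptions, not code from the surveyed papers.

```python
# Minimal reparameterization-trick sketch for one Gaussian variational weight.
import torch

mu = torch.zeros(5, requires_grad=True)                # variational mean
rho = torch.full((5,), -3.0, requires_grad=True)       # std via softplus(rho)

def sample_weight():
    # w = mu + softplus(rho) * eps, eps ~ N(0, I); gradients flow to mu and rho.
    eps = torch.randn_like(mu)
    sigma = torch.nn.functional.softplus(rho)
    return mu + sigma * eps

w = sample_weight()
loss = (w ** 2).sum()   # stand-in for a data-dependent loss term
loss.backward()         # mu.grad and rho.grad are now populated
```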