2022
DOI: 10.48550/arxiv.2205.00403
Preprint

A Simple Approach to Improve Single-Model Deep Uncertainty via Distance-Awareness

Abstract: Accurate uncertainty quantification is a major challenge in deep learning, as neural networks can make overconfident errors and assign high-confidence predictions to out-of-distribution (OOD) inputs. The most popular approaches to estimate predictive uncertainty in deep learning are methods that combine predictions from multiple neural networks, such as Bayesian neural networks (BNNs) and deep ensembles. However, their practicality in real-time, industrial-scale applications is limited due to the high memory a…

Cited by 2 publications (2 citation statements) · References 55 publications

“…Orthogonally, several (less scalable) works advocated for leveraging the compositional (perhaps causal [44]) structure in the underlying data-generative process to introduce suitable inductive biases [45][46][47][48][49][50][51]. (4) Simultaneously, Bayesian approaches for uncertainty predictions have been proposed to improve model calibration [52][53][54][55][56][57][58][59][60] and robustness on new distributions [61][62][63]. Recent work, however, found that larger models were natively better calibrated [64].…”
Section: Introduction (mentioning)
confidence: 99%
“…We refer the interested reader to [24]. Note that the SNGP approach proved to effectively increase the model's predictive uncertainty quality in an OOD detection setting for remote sensing image classification [19].…”
Section: Output Mapping Distance Awareness (mentioning)
confidence: 99%
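
Below is a minimal sketch of the distance-awareness idea referenced above, assuming a PyTorch setting. The class names, dimensions, and the small ReLU backbone are illustrative assumptions, not the authors' reference implementation; the paper's SNGP method additionally maintains a Laplace-approximation covariance over the output weights to obtain predictive variance, which this sketch omits. The two ingredients shown are spectral normalization on the hidden layers (bounding how much the feature map can distort input distances) and a random-Fourier-feature Gaussian-process output layer (making the logits depend on distance in feature space).

```python
import math

import torch
import torch.nn as nn


class RandomFeatureGPHead(nn.Module):
    """Random-Fourier-feature approximation of a GP output layer (illustrative)."""

    def __init__(self, in_dim: int, num_classes: int, num_features: int = 1024):
        super().__init__()
        # Fixed (untrained) random projection defining the RFF map for an RBF kernel.
        self.register_buffer("proj", torch.randn(num_features, in_dim))
        self.register_buffer("bias", 2.0 * math.pi * torch.rand(num_features))
        self.out = nn.Linear(num_features, num_classes, bias=False)
        self.scale = math.sqrt(2.0 / num_features)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # phi(h) approximates an RBF kernel, so logits vary with distance in feature space.
        phi = self.scale * torch.cos(h @ self.proj.T + self.bias)
        return self.out(phi)


class DistanceAwareClassifier(nn.Module):
    """Spectrally normalized feature extractor plus GP-style head, in the spirit of SNGP."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        # Spectral normalization bounds each layer's Lipschitz constant, so the hidden
        # representation roughly preserves input-space distances ("distance awareness").
        self.features = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(in_dim, hidden_dim)),
            nn.ReLU(),
            nn.utils.spectral_norm(nn.Linear(hidden_dim, hidden_dim)),
            nn.ReLU(),
        )
        self.head = RandomFeatureGPHead(hidden_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))


if __name__ == "__main__":
    # Toy usage: 8 inputs of dimension 32, 10 classes.
    model = DistanceAwareClassifier(in_dim=32, hidden_dim=64, num_classes=10)
    logits = model(torch.randn(8, 32))
    print(logits.shape)  # torch.Size([8, 10])
```

The design choice of replacing a plain dense output layer with a kernel-based head is what lets a single deterministic model report lower confidence far from the training data, which is the property the OOD-detection result cited in [19] relies on.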