2022
DOI: 10.48550/arxiv.2201.13279
Preprint

UQGAN: A Unified Model for Uncertainty Quantification of Deep Classifiers trained via Conditional GANs

Abstract: We present an approach to quantifying both aleatoric and epistemic uncertainty for deep neural networks in image classification, based on generative adversarial networks (GANs). While most works in the literature that use GANs to generate out-of-distribution (OoD) examples only focus on the evaluation of OoD detection, we present a GAN based approach to learn a classifier that exhibits proper uncertainties for OoD examples as well as for false positives (FPs). Instead of shielding the entire in-distribution da…

Cited by 1 publication (2 citation statements)
References 14 publications
“…the flow model is integrated into GAN to enable likelihood estimation and better uncertainty quantification. Another approach [39,99] further extends the method by utilizing the Wasserstein GAN [4] based on gradient penalty to improve model convergence. The key advantage of deep generative modeling for uncertainty quantification is that they directly parameterize a deep neural network to represent the prediction distribution, without the need to have an explicit distribution format.…”
Section: Deep Generative
confidence: 99%
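The gradient penalty referenced in the citation statement above (from Wasserstein GAN with gradient penalty, WGAN-GP) constrains the critic's gradient norm to 1 on points interpolated between real and generated samples. A minimal numpy sketch, using a hypothetical quadratic critic f(x) = xᵀAx (a stand-in for a neural critic, so the gradient is available analytically):

```python
import numpy as np

rng = np.random.default_rng(0)

def critic_grad(x, A):
    # Analytic gradient of the toy quadratic critic f(x) = x^T A x,
    # namely (A + A^T) x. A neural critic would use autodiff here.
    return (A + A.T) @ x

def gradient_penalty(real, fake, A, lam=10.0):
    # WGAN-GP: sample points on straight lines between real and fake
    # batches and penalize deviation of the critic's gradient norm from 1.
    eps = rng.uniform(size=(real.shape[0], 1))
    interp = eps * real + (1.0 - eps) * fake
    norms = np.array([np.linalg.norm(critic_grad(x, A)) for x in interp])
    return lam * np.mean((norms - 1.0) ** 2)

A = 0.5 * np.eye(3)
real = rng.normal(size=(8, 3))
fake = rng.normal(size=(8, 3))
gp = gradient_penalty(real, fake, A)
```

In training, this penalty is added to the critic's loss in place of weight clipping, which is the convergence improvement the citing work attributes to the WGAN-GP formulation.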
“…One direction of uncertainty quantification for PINNs is to build a probabilistic neural network p_θ(u | x, d, z) for propagating uncertainty originating from various sources, where u(x, d) represents the spatio-temporal random field with the explanatory spatial variable x and temporal variable d, and z is a latent variable that follows a prior distribution p(z) and aims to encode the variability of the high-dimensional observation u into a low-dimensional embedding. Many approaches based on deep generative models for uncertainty quantification have been proposed to leverage their capability of probabilistic modeling for structured outputs [26,39,99].…”
Section: UQ for Physics-Aware DNN Models
confidence: 99%
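The latent-variable construction p_θ(u | x, d, z) described in the statement above can be sketched by sampling the latent z from its prior and summarizing the induced distribution over u. A toy numpy example, where `decoder` and its fixed random weights are hypothetical stand-ins for a trained generative network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fixed weights standing in for a trained decoder network.
W_in = rng.normal(size=(8, 4))   # inputs: (x1, x2, d, z)
w_out = rng.normal(size=8)

def decoder(x, d, z):
    # Toy surrogate for p_theta(u | x, d, z): one hidden tanh layer
    # mapping spatial coordinates, time, and the latent code to u.
    h = np.tanh(W_in @ np.array([x[0], x[1], d, z]))
    return w_out @ h

def predictive_moments(x, d, n_samples=500):
    # Propagate uncertainty by sampling z ~ p(z) = N(0, 1) and
    # summarizing the induced distribution over u(x, d).
    zs = rng.normal(size=n_samples)
    us = np.array([decoder(x, d, z) for z in zs])
    return us.mean(), us.std()

mean_u, std_u = predictive_moments(x=(0.3, -0.1), d=0.5)
```

The spread `std_u` reflects the variability the latent z encodes; in the cited approaches this sampling-based summary is what yields the prediction distribution without an explicit closed-form density.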