2022 26th International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr56361.2022.9956231

AutoDEUQ: Automated Deep Ensemble with Uncertainty Quantification

Abstract: Deep neural networks are powerful predictors for a variety of tasks. However, they do not capture uncertainty directly. Using neural network ensembles to quantify uncertainty is competitive with approaches based on Bayesian neural networks while benefiting from better computational scalability. However, building ensembles of neural networks is a challenging task because, in addition to choosing the right neural architecture or hyperparameters for each member of the ensemble, there is an added cost of training …
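
To make the idea in the abstract concrete, below is a minimal sketch of a deep ensemble for regression with predictive uncertainty, written in PyTorch. It is not the paper's AutoDEUQ pipeline (which automates the search over architectures and hyperparameters); the member widths, toy data, and training settings are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianMLP(nn.Module):
    """One ensemble member: predicts a mean and a variance for each input."""
    def __init__(self, hidden):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 2))

    def forward(self, x):
        mu, log_var = self.body(x).chunk(2, dim=-1)
        return mu, log_var.exp()

# Toy 1-D regression data: y = sin(x) + noise (illustrative only).
x = torch.linspace(-3, 3, 256).unsqueeze(-1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

# Vary the hidden width across members as a stand-in for the varied
# architectures/hyperparameters the abstract refers to.
members = [GaussianMLP(h) for h in (16, 32, 64)]
for net in members:
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(200):
        mu, var = net(x)
        loss = F.gaussian_nll_loss(mu, y, var)  # heteroscedastic Gaussian NLL
        opt.zero_grad(); loss.backward(); opt.step()

# Ensemble prediction: treat the members as an equally weighted Gaussian mixture.
with torch.no_grad():
    mus, vars_ = map(torch.stack, zip(*(net(x) for net in members)))
    mu_ens = mus.mean(0)                                    # mixture mean
    var_ens = (vars_ + mus.pow(2)).mean(0) - mu_ens.pow(2)  # mixture variance
```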

Cited by 12 publications (12 citation statements); references 13 publications.

Citation statements:
“…In [16], the authors pursue the same heuristic endeavor, but take an ensemble route. They too consider different BNNs, but instead of keeping them separate and using them to build a posterior credal set, they average them out.…”
Section: Methods (mentioning)
confidence: 99%
“…Results. Following [16], for EBNN we posit that $\frac{1}{k}\sum_{j=1}^{k}\sigma_j^2$ captures the aleatoric uncertainty, and $\frac{1}{k-1}\sum_{j=1}^{k}(\mu_j - \mu_{\mathrm{ens}})^2$ captures the epistemic uncertainty; we use these values as baselines. 10 We discuss the results for CIFAR10 presented in Table 3.…”
Section: Appendix E, Bounds on Upper and Lower Entropy (mentioning)
confidence: 99%
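
The decomposition quoted above is straightforward to compute from the per-member predictions. The NumPy sketch below is a hedged illustration (the function name and example numbers are mine, not from either paper): it averages the member variances for the aleatoric part and takes the sample variance of the member means for the epistemic part.

```python
import numpy as np

def ensemble_uncertainty(mus, sigma2s):
    """Split the predictive uncertainty of a k-member ensemble whose members
    each output a Gaussian (mu_j, sigma_j^2) for one input.

    aleatoric = (1/k) * sum_j sigma_j^2              (mean member variance)
    epistemic = (1/(k-1)) * sum_j (mu_j - mu_ens)^2  (sample variance of means)
    """
    mus = np.asarray(mus, dtype=float)
    sigma2s = np.asarray(sigma2s, dtype=float)
    aleatoric = sigma2s.mean()
    epistemic = ((mus - mus.mean()) ** 2).sum() / (len(mus) - 1)
    return aleatoric, epistemic

# Example: a 4-member ensemble's predictions for a single test point.
au, eu = ensemble_uncertainty(mus=[1.9, 2.1, 2.0, 2.4], sigma2s=[0.30, 0.25, 0.35, 0.28])
print(au, eu)  # 0.295 and ~0.0467
```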
“…Uncertainty Quantification (UQ) is an active field of research, and various methods have been proposed to efficiently estimate the uncertainty of machine learning models (see Abdar et al. 2021 for an extensive overview). While most research focuses on Bayesian deep learning (Srivastava et al. 2014; Blundell et al. 2015; Sensoy, Kandemir, and Kaplan 2018; Fan et al. 2020; Järvenpää, Vehtari, and Marttinen 2020; Charpentier, Zügner, and Günnemann 2020), deep ensemble methods, which benefit from the advantages of both deep learning and ensemble learning, have recently been leveraged for empirical uncertainty quantification (Egele et al. 2021; Hoffmann, Fortmeier, and Elster 2021; Brown, Bhuiyan, and Talbert 2020; Althoff, Rodrigues, and Bazame 2021). Although Bayesian UQ methods have a solid theoretical foundation, they often require significant changes to the training procedure and are computationally expensive compared to non-Bayesian techniques such as ensembles (Egele et al. 2021; Rahaman and Thiery 2021; Lakshminarayanan, Pritzel, and Blundell 2017).…”
Section: Related Work (mentioning)
confidence: 99%
“…This enables the controller to avoid state spaces with high uncertainty owing to noisy exteroceptive input or insufficient model learning. UC(x) is defined as the total amount of uncertainty inherent in the model prediction, which can be expressed as the sum of the aleatoric uncertainty AU(x) and the epistemic uncertainty EU(x) as follows [29]: $UC(x) = AU(x) + EU(x)$.…”
Section: B. Off-road Cost Function Design (mentioning)
confidence: 99%
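
The citing work folds the two components into a single uncertainty term for its cost function. A minimal sketch of that combination is below; the penalty weight and the additive cost form are assumptions for illustration, not the cited controller's exact design.

```python
def total_uncertainty(au, eu):
    # UC(x) = AU(x) + EU(x): total predictive uncertainty as the sum of the
    # aleatoric and epistemic components.
    return au + eu

def offroad_cost(base_cost, au, eu, weight=1.0):
    # Hypothetical penalised traversal cost: states with large predictive
    # uncertainty become expensive, so the controller steers away from them.
    return base_cost + weight * total_uncertainty(au, eu)

print(offroad_cost(base_cost=0.8, au=0.295, eu=0.047, weight=2.0))  # 1.484
```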