2020
DOI: 10.48550/arxiv.2007.06823
Preprint

Hands-on Bayesian Neural Networks -- a Tutorial for Deep Learning Users

Laurent Valentin Jospin,
Wray Buntine,
Farid Boussaid
et al.

Abstract: Modern deep learning methods have equipped researchers and engineers with incredibly powerful tools to tackle problems that previously seemed impossible. However, since deep learning methods operate as black boxes, the uncertainty associated with their predictions is often challenging to quantify. Bayesian statistics offer a formalism to understand and quantify the uncertainty associated with deep neural networks' predictions. This paper provides a tutorial for researchers and scientists who are using machine l…
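The abstract's central point, that a Bayesian treatment turns a point-predicting network into one that also reports uncertainty, can be illustrated with Monte Carlo dropout [Gal and Ghahramani, 2016], one widely used approximation cited by the papers below. The following is a minimal NumPy sketch, not the tutorial's own code: the toy weights and network shape are illustrative assumptions. Keeping dropout active at prediction time and aggregating repeated stochastic forward passes yields a predictive mean plus a spread that serves as an uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny fixed-weight network; dropout left ON at prediction time supplies the
# stochasticity that approximates sampling from a distribution over networks.
W1 = rng.normal(size=(1, 50))
W2 = rng.normal(size=(50, 1)) / np.sqrt(50)

def stochastic_forward(x, p_drop=0.5):
    """One forward pass with an independent dropout mask (one MC sample)."""
    h = np.tanh(x @ W1)
    mask = (rng.random(h.shape) > p_drop) / (1.0 - p_drop)  # inverted dropout
    return (h * mask) @ W2

x = np.array([[0.3]])
samples = np.stack([stochastic_forward(x) for _ in range(200)])

mean = samples.mean(axis=0)   # predictive mean over MC samples
std = samples.std(axis=0)     # spread across samples = uncertainty estimate
print(mean.item(), std.item())
```

A deterministic network would return the same output on every call; here the per-call variation is exactly what gets summarized as predictive uncertainty.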

Cited by 50 publications (82 citation statements)
References 28 publications
“…Many models have attempted to incorporate Bayesian elements into deep learning (see e.g. Bayesian networks [MacKay, 1992, Jospin et al, 2020], dropout recovering deep Gaussian process [Gal and Ghahramani, 2016], others [Osawa et al, 2019]). Often, these models require costly Monte Carlo training strategies, while the current approach can be trained with gradient descent.…”
Section: Introduction
confidence: 99%
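The contrast drawn in the statement above, costly Monte Carlo training versus plain gradient descent, can be made concrete: variational BNN training in the Bayes-by-backprop style optimizes the parameters of an approximate posterior by ordinary gradient descent, using the reparameterization trick so that only a single Monte Carlo sample per step is needed. The following self-contained NumPy sketch infers a Gaussian posterior over one weight of a toy linear model; every constant and name here is an illustrative assumption, not code from any cited paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: y = 2x + noise. We fit q(w) = N(mu, s^2) by gradient
# descent on a one-sample estimate of the negative ELBO (prior: w ~ N(0, 1)).
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 0.3 * rng.normal(size=100)
noise_var = 0.09                       # assumed known observation noise
mu, rho = 0.0, 0.0                     # variational params; s = softplus(rho)
lr = 1e-4

for _ in range(5000):
    s = np.log1p(np.exp(rho))          # softplus keeps the std positive
    eps = rng.normal()
    w = mu + s * eps                   # reparameterization trick
    # Gradient of the negative log-likelihood w.r.t. the sampled weight.
    g_w = np.sum((w * x - y) * x) / noise_var
    # Add closed-form gradients of KL(q || N(0, 1)).
    g_mu = g_w + mu
    g_s = g_w * eps + (s - 1.0 / s)
    mu -= lr * g_mu
    rho -= lr * g_s / (1.0 + np.exp(-rho))  # chain rule through softplus

s = np.log1p(np.exp(rho))
print(f"posterior over w: mean={mu:.2f}, std={s:.3f}")
```

After training, the posterior mean sits near the data-generating weight and the posterior std has contracted well below its initial value, all without any Markov-chain sampling during training.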
“…Alternative methods are based, indicatively, on ensembles of NN optimization iterates or independently trained NNs [39][40][41][42][43][44][45][46][47][48][49][50], as well as on the evidential framework [51][52][53][54][55][56][57][58][59]. Although Bayesian methods and ensembles are thoroughly discussed in this paper, the interested reader is also directed to the recent review studies in [60][61][62][63][64][65][66][67][68][69][70][71][72] for more information. Clearly, in the context of SciML, which may involve differential equations with unknown or uncertain terms and parameters, UQ becomes an even more demanding task; see Fig.…”
Section: Motivation and Scope Of The Paper
confidence: 99%
“…BNNs constitute a new direction in machine learning [16]. By connecting Bayesian statistics and deep learning, BNNs combine the benefits of Bayesian uncertainty quantification with the predictive power of NNs.…”
Section: A Model Selection
confidence: 99%