2018
DOI: 10.48550/arxiv.1802.04893
Preprint
Uncertainty Estimation via Stochastic Batch Normalization

Cited by 5 publications (10 citation statements)
References 0 publications
“…15. In another study, Atanov et al [165] introduced a probabilistic model and showed that the Batch Normalization (BN) procedure maximizes a lower bound on its induced marginalized log-likelihood. Since exact inference was not computationally efficient, they proposed Stochastic BN (SBN) to approximate the inference procedure, serving as an uncertainty estimation method.…”
Section: Deep Gaussian Processesmentioning
confidence: 99%
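The excerpt above describes sampling batch-normalization statistics at test time as a source of predictive uncertainty. A minimal NumPy sketch of that idea follows; it is not the authors' implementation, and the Gaussians fitted over per-batch statistics (`mu_loc`, `var_loc`, etc.) are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-batch statistics collected during training:
# one row of per-feature means / variances per mini-batch.
batch_means = rng.normal(0.0, 0.1, size=(100, 4))
batch_vars = rng.uniform(0.8, 1.2, size=(100, 4))

# Fit simple Gaussians over the collected statistics; sampling from
# them at test time injects noise of the kind BN sees during training.
mu_loc, mu_scale = batch_means.mean(0), batch_means.std(0)
var_loc, var_scale = batch_vars.mean(0), batch_vars.std(0)

def stochastic_bn(x, n_samples=50, eps=1e-5):
    """Normalize with sampled statistics; average over the samples."""
    outs = []
    for _ in range(n_samples):
        mu = rng.normal(mu_loc, mu_scale)
        var = np.abs(rng.normal(var_loc, var_scale))
        outs.append((x - mu) / np.sqrt(var + eps))
    outs = np.stack(outs)
    return outs.mean(0), outs.std(0)  # predictive mean and spread

x = rng.normal(size=(8, 4))
mean, std = stochastic_bn(x)
```

The spread across samples (`std`) is the uncertainty signal: inputs far from the training statistics normalize inconsistently across draws.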
“…The uncertainty caused by the parameters of a neural network is known as epistemic uncertainty. It is modeled by placing a prior distribution (e.g., a Gaussian prior: W ∼ N(0, I)) on the parameters of a network and then capturing how much these weights vary given specific data. Recent efforts in this area include Bayes by Backprop (Blundell et al 2015), the closely related mean-field variational inference under a Gaussian prior (Tölle et al 2021), stochastic batch normalization (Atanov et al 2018), and Monte-Carlo (MC) dropout (Gal and Ghahramani 2016; Kendall and Gal 2017). Applications of Bayesian deep learning in medical imaging include image denoising (Tölle et al 2021; Laves et al 2020b) and image segmentation (DeVries and Taylor 2018; Baumgartner et al 2019; Mehrtash et al 2020).…”
Section: Bayesian Deep Learningmentioning
confidence: 99%
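Among the methods the excerpt lists, MC dropout is the simplest to sketch: dropout is kept active at test time, and the spread over repeated stochastic forward passes estimates epistemic uncertainty. The toy one-layer network below (weights `W`, drop rate `p_drop`) is a hypothetical stand-in, not any cited model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy fixed weights for a single linear layer (hypothetical).
W = rng.normal(size=(4, 3))

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout kept active."""
    mask = rng.random(x.shape) > p_drop
    h = (x * mask) / (1.0 - p_drop)  # inverted-dropout scaling
    return h @ W

def mc_dropout_predict(x, n_samples=200):
    """Predictive mean and epistemic spread over stochastic passes."""
    preds = np.stack([forward(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)

x = rng.normal(size=(5, 4))
mean, std = mc_dropout_predict(x)
```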
“…Various BNN approximations estimate the true posterior p(w|F, Y) with a variational distribution q(w) that is feasible to train. Popular BNN approximations include stochastic variational inference [52]-[54], multiplicative normalizing flows [55], stochastic batch normalization [56], and variational inference by Monte-Carlo (MC) dropout [57], [58]. Non-Bayesian approaches instead train many models as a deep ensemble or via bootstrapping [59], [60] and measure uncertainty through the predictive variance.…”
Section: E Epistemic Uncertaintymentioning
confidence: 99%
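The non-Bayesian alternative the excerpt mentions, measuring uncertainty as disagreement across an ensemble, can be sketched in a few lines. Here the ensemble members are hypothetical linear models with randomly drawn weights standing in for independently trained networks.

```python
import numpy as np

rng = np.random.default_rng(2)

# A "deep ensemble" of hypothetical linear models; in practice each
# member would be an independently trained network.
ensemble = [rng.normal(size=(4, 1)) for _ in range(10)]

def ensemble_predict(x):
    """Uncertainty as predictive variance across ensemble members."""
    preds = np.stack([x @ w for w in ensemble])
    return preds.mean(0), preds.var(0)

x = rng.normal(size=(6, 4))
mean, var = ensemble_predict(x)
```

High variance flags inputs on which the members disagree, which typically happens away from the training distribution.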