2020
DOI: 10.1103/physrevd.102.103509
Parameter estimation for the cosmic microwave background with Bayesian neural networks


Cited by 30 publications (27 citation statements)
References 54 publications
“…We certainly need to better understand the predictive uncertainty. This may include dividing it into epistemic and aleatoric uncertainty (Kiureghian & Ditlevsen 2009; Gal 2016; Hortúa et al. 2020). The epistemic uncertainty is reducible by increasing the number of observations, since a limited training set can be insufficient to cover the entire feature space, while the aleatoric uncertainty captures intrinsic randomness (such as photon noise) and cannot be reduced by collecting more data.…”
Section: Discussion
confidence: 99%
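As a rough illustration of the decomposition described in the excerpt above, the sketch below follows the common Monte Carlo recipe (in the spirit of Kendall & Gal; it is not code from the cited papers, and the callable `stochastic_model` is a hypothetical stand-in): a network head that outputs a predicted mean and an aleatoric variance is evaluated several times with stochastic weights; the spread of the sampled means gives the epistemic term, and the average predicted variance gives the aleatoric term.

```python
import numpy as np

def decompose_uncertainty(stochastic_model, x, n_samples=100):
    """Monte Carlo split of predictive uncertainty into epistemic + aleatoric.

    `stochastic_model(x)` is assumed to return (mean, variance) for one
    stochastic forward pass (e.g. with dropout or sampled weights active).
    """
    means, variances = [], []
    for _ in range(n_samples):
        mu, var = stochastic_model(x)   # one draw from the weight posterior
        means.append(mu)
        variances.append(var)
    means = np.stack(means)             # shape: (n_samples, n_outputs)
    variances = np.stack(variances)

    epistemic = means.var(axis=0)       # spread of means across weight draws
    aleatoric = variances.mean(axis=0)  # average predicted data noise
    total = epistemic + aleatoric
    return epistemic, aleatoric, total
```

Under this split, only the epistemic term shrinks as more training data are added; the aleatoric term reflects noise in the data itself, such as the photon noise mentioned above.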
“…Alternatively, ResUNet-CMB could be used as the core component to construct a Bayesian neural network [92] that is capable of producing a probabilistic distribution of outputs. Bayesian neural networks have a growing presence in cosmology, with applications such as parameter and uncertainty estimation with the CMB [93,94] and with strong gravitational lensing [95,96]. Incorporating ResUNet-CMB into a Bayesian neural network would allow for predictions with well-characterized posterior probabilities.…”
Section: Discussion
confidence: 99%
“…Assuming the variational distribution can be expressed as a mean plus a perturbation, randomly multiplying each perturbation by either 1 or −1 ensures that the weights across a batch are at least partially decorrelated. This method has proven effective in recent applications of BNNs [42,43], with the additional advantage of being available as pre-built implementations, for both dense and convolutional layers, in popular deep learning libraries like TensorFlow [61]. In this paper we make use of TensorFlow [62] and TensorFlow Probability [61] throughout.…”
Section: Classification in BNNs
confidence: 99%
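The perturbation-sign-flipping scheme described in this excerpt is the Flipout estimator, which TensorFlow Probability ships as ready-made dense and convolutional variational layers. A minimal sketch of how such layers are typically assembled is shown below; the architecture, layer sizes, and loss wiring are illustrative assumptions, not the configuration used in the cited works.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Flipout layers sample a weight perturbation and flip its sign
# pseudo-independently across the batch, decorrelating the per-example
# gradients while keeping only one shared perturbation in memory.
model = tf.keras.Sequential([
    tfp.layers.Convolution2DFlipout(16, kernel_size=3, activation="relu",
                                    input_shape=(64, 64, 1)),
    tf.keras.layers.Flatten(),
    tfp.layers.DenseFlipout(64, activation="relu"),
    tfp.layers.DenseFlipout(10),  # logits for 10 illustrative classes
])

def neg_log_likelihood(y_true, logits):
    # Likelihood term of the variational objective.
    return -tfd.Categorical(logits=logits).log_prob(y_true)

# The KL terms contributed by each variational layer are collected in
# model.losses and added to the likelihood term by Keras during fit().
model.compile(optimizer="adam", loss=neg_log_likelihood)
```

Because the weights are stochastic, repeated calls to `model(x)` give different outputs, and averaging over many such forward passes yields the predictive distribution discussed above.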
“…Once a probability distribution is defined, BNNs are also able to determine whether an example does not belong to any of the classes in the training set. BNNs have recently been applied in many fields, such as gravitational waves [41,42], the Cosmic Microwave Background [43], autonomous driving [44], and cellular image classification [45].…”
Section: Introduction
confidence: 99%
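One common way to exploit that out-of-distribution property, sketched below under the assumption of a classifier whose stochastic forward passes return softmax vectors (the helper names and the threshold are hypothetical, not from the cited papers), is to average the class probabilities over several passes and flag inputs whose predictive entropy is unusually high.

```python
import numpy as np

def predictive_entropy(mc_probs):
    """Entropy of the Monte-Carlo-averaged class probabilities.

    mc_probs: array of shape (n_samples, n_classes), one softmax vector
    per stochastic forward pass of the BNN for a single input.
    """
    mean_probs = mc_probs.mean(axis=0)
    return -np.sum(mean_probs * np.log(mean_probs + 1e-12))

def looks_out_of_distribution(mc_probs, threshold):
    # High entropy: no training class is clearly favoured, so the example
    # is flagged as possibly lying outside the training classes.
    return predictive_entropy(mc_probs) > threshold
```

The threshold would in practice be calibrated on held-out examples of the known classes, so that in-distribution inputs are rarely flagged.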