2020
DOI: 10.48550/arxiv.2003.06097
Preprint

B-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data

Liu Yang,
Xuhui Meng,
George Em Karniadakis

Abstract: We propose a Bayesian physics-informed neural network (B-PINN) to solve both forward and inverse nonlinear problems described by partial differential equations (PDEs) and noisy data. In this Bayesian framework, the Bayesian neural network (BNN) combined with a PINN for PDEs serves as the prior while the Hamiltonian Monte Carlo (HMC) or the variational inference (VI) could serve as an estimator of the posterior. B-PINNs make use of both physical laws and scattered noisy measurements to provide predictions and q…
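The construction in the abstract can be made concrete. Below is a minimal, hypothetical sketch (not the authors' code) of a B-PINN-style log-posterior: a Gaussian prior over the surrogate parameter θ, plus Gaussian likelihoods for noisy measurements of u and for PDE residuals at collocation points. A one-parameter surrogate u(x; θ) = θ·sin(x) for the toy problem −u″(x) = sin(x) stands in for a full BNN; all names and noise scales here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward problem: -u''(x) = f(x) = sin(x) on [0, pi], true u(x) = sin(x).
# Surrogate (stand-in for a BNN): u(x; theta) = theta * sin(x), so the
# PDE residual is r(x; theta) = -u'' - f = (theta - 1) * sin(x).

x_data = np.linspace(0.3, 2.8, 8)                    # sparse sensor locations
u_data = np.sin(x_data) + 0.05 * rng.normal(size=8)  # noisy measurements of u
x_col = np.linspace(0.0, np.pi, 32)                  # PDE collocation points

sigma_u, sigma_r, sigma_prior = 0.05, 0.05, 1.0      # assumed noise/prior scales

def log_posterior(theta):
    """Unnormalized log P(theta | D): prior + data likelihood + PDE likelihood."""
    log_prior = -0.5 * theta**2 / sigma_prior**2
    log_lik_data = -0.5 * np.sum((theta * np.sin(x_data) - u_data)**2) / sigma_u**2
    residual = (theta - 1.0) * np.sin(x_col)
    log_lik_pde = -0.5 * np.sum(residual**2) / sigma_r**2
    return log_prior + log_lik_data + log_lik_pde
```

HMC or VI would then target this density; in an actual B-PINN, θ is the full weight vector of the BNN and the residuals come from automatic differentiation rather than a closed form.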

Cited by 12 publications (21 citation statements)
References 23 publications
“…The incorporation of data from multiple sensor types, such as active piezo-electric sensors can also be naturally implemented in the GNN framework. In addition, GNN-based computation may be a viable alternative to Bayesian PDE inversion problems [13], where observations are sparse, in place of the grid-based CNN computational substrate. Finally, GNNs offer great potential for inference of data-driven Reduced Order Models (ROMs), directly from physical observations.…”
Section: Discussion
confidence: 99%
“…In particular, the Hamiltonian Monte Carlo (HMC) method [16], which is frequently used as ground truth for posterior estimation, is adopted here to estimate the posterior distributions in the BNN [17]. In general, estimating the posterior distributions of the BNN hyperparameters with HMC is computationally prohibitive for problems with big data [14]. However, high-fidelity data are generally scarce due to their high acquisition cost.…”
Section: Multi-fidelity Bayesian Neural Network
confidence: 99%
“…Generally, P(D) is analytically intractable. Here, we employ HMC to sample from the unnormalized P(θ|D) [14,16]. Then, we can obtain predictions on u at any x (i.e., {ũ^(i)(x)}_{i=1}^{M}) based on the posterior samples (i.e., {θ^(i)}_{i=1}^{M}) from the HMC.…”
Section: Multi-fidelity Bayesian Neural Network
confidence: 99%
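The HMC step described in the excerpt above — sampling from an unnormalized posterior — can be sketched in plain NumPy. This is not the paper's implementation; it is a minimal leapfrog HMC sampler applied to an assumed one-dimensional target (an unnormalized standard Gaussian) whose gradient is known in closed form.

```python
import numpy as np

def hmc_sample(log_p, grad_log_p, theta0, n_samples=2000, step=0.1, n_leap=20, seed=0):
    """Hamiltonian Monte Carlo on a scalar unnormalized log-density log_p."""
    rng = np.random.default_rng(seed)
    theta = float(theta0)
    samples = []
    for _ in range(n_samples):
        r = rng.normal()                      # resample momentum
        th, mom = theta, r
        mom += 0.5 * step * grad_log_p(th)    # leapfrog: initial half step
        for _ in range(n_leap - 1):
            th += step * mom
            mom += step * grad_log_p(th)
        th += step * mom
        mom += 0.5 * step * grad_log_p(th)    # leapfrog: final half step
        # Metropolis accept/reject on the Hamiltonian H = -log_p + kinetic energy
        h_new = -log_p(th) + 0.5 * mom**2
        h_old = -log_p(theta) + 0.5 * r**2
        if np.log(rng.uniform()) < h_old - h_new:
            theta = th
        samples.append(theta)
    return np.array(samples)

# Target: unnormalized standard Gaussian (normalizing constant never needed).
log_p = lambda t: -0.5 * t**2
grad_log_p = lambda t: -t
draws = hmc_sample(log_p, grad_log_p, theta0=3.0)
```

In a B-PINN, `log_p` would be the joint log-density of the BNN prior, the data likelihood, and the PDE-residual likelihood, with gradients supplied by automatic differentiation; posterior samples θ^(i) then yield the predictive ensemble ũ^(i)(x).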
“…It is particularly useful for systems where there is a high cost of data acquisition or lack of high resolution data [16]. The authors in [17] proposed a Bayesian approach for physics informed neural network to solve forward and inverse problems.…”
Section: Introduction
confidence: 99%