2021
DOI: 10.48550/arxiv.2112.03749
Preprint
Interpolating between BSDEs and PINNs: deep learning for elliptic and parabolic boundary value problems

Abstract: Solving high-dimensional partial differential equations is a recurrent challenge in economics, science and engineering. In recent years, a great number of computational approaches have been developed, most of them relying on a combination of Monte Carlo sampling and deep learning based approximation. For elliptic and parabolic problems, existing methods can broadly be classified into those resting on reformulations in terms of backward stochastic differential equations (BSDEs) and those aiming to minimize a re…

Cited by 2 publications (2 citation statements)
References 67 publications (136 reference statements)
“…In contrast, the deep-BSDE methods exploit the intrinsic connection between PDEs and BSDEs in the form of the nonlinear Feynman-Kac formula to recast the PDE as a stochastic control problem that is solved by reinforcement learning techniques [18,19,20,21,22]. See [23] for a recent work on interpolating between PINNs and deep BSDEs. Finally, energy methods construct the loss function by taking advantage of the variational formulation of elliptic PDEs [24,25,26,27].…”
Section: Introduction
confidence: 99%
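The PDE–SDE connection underlying the deep-BSDE methods quoted above can be illustrated in its simplest, linear form with a short Monte Carlo sketch. The problem below (a heat equation with a quadratic terminal condition, for which the solution is known in closed form) is chosen purely for illustration; the cited methods handle nonlinear PDEs by training neural networks along simulated paths.

```python
import numpy as np

# Linear Feynman-Kac illustration: for the backward heat equation
#   u_t + 0.5 * u_xx = 0 on [0, T],  u(T, x) = g(x),
# the solution is u(t, x) = E[g(x + W_{T-t})], W a standard Brownian motion.
# With g(x) = x^2 the closed form is u(t, x) = x^2 + (T - t).

rng = np.random.default_rng(1)

def u_mc(t, x, T=1.0, n_paths=200_000):
    # Sample terminal Brownian increments W_{T-t} ~ N(0, T - t)
    increments = rng.standard_normal(n_paths) * np.sqrt(T - t)
    return np.mean((x + increments) ** 2)   # Monte Carlo estimate of E[g(x + W_{T-t})]

t, x = 0.25, 0.5
approx = u_mc(t, x)
exact = x**2 + (1.0 - t)
print(approx, exact)
```

With 200,000 paths the Monte Carlo estimate matches the closed-form value to a few decimal places; deep-BSDE solvers replace the known terminal expectation with a learned representation of the solution and its gradient along each path.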
“…In recent years, significant progress has been achieved towards these challenges with several numerical methods using techniques from deep learning; see the recent surveys by [3] and [8]. A first class of approximation algorithms, called Physics-Informed Neural Networks (PINNs) [28], also known as the Deep Galerkin Method (DGM) [30], directly approximates the solution to the PDE by a neural network, and its partial derivatives by automatic differentiation, by minimizing the loss function arising from the residual of the PDE evaluated on a random grid in the space-time domain. A second class of algorithms relies on the backward stochastic differential equation representation of the PDE in the semi-linear case, minimizing either a global loss function (see [6], and extensions in [2], [17], [24]) or a sequence of loss functions from a backward recursion (see [16], and variations and extensions in [25], [1], [8]).…”
Section: Introduction
confidence: 99%
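The residual-minimization idea behind PINNs described in the quote above can be sketched in a few lines. The toy problem, network size, and optimizer below are all illustrative choices, not the cited methods' actual setup: a one-hidden-layer tanh ansatz whose second derivative is available in closed form stands in for a deep network with automatic differentiation, and numerical-gradient descent stands in for a proper optimizer.

```python
import numpy as np

# Minimal PINN-style sketch for the 1D Poisson problem
#   -u''(x) = pi^2 sin(pi x) on (0, 1),  u(0) = u(1) = 0,
# whose exact solution is sin(pi x). The ansatz u(x) = sum_i a_i tanh(w_i x + b_i)
# has a closed-form second derivative, so no autodiff library is needed here.

rng = np.random.default_rng(0)
n_hidden = 8
theta = 0.5 * rng.standard_normal(3 * n_hidden)  # stacked (w, b, a)

def u(theta, x):
    w, b, a = np.split(theta, 3)
    return np.tanh(np.outer(x, w) + b) @ a

def u_xx(theta, x):
    # d^2/dx^2 tanh(w x + b) = -2 w^2 tanh(z) (1 - tanh(z)^2)
    w, b, a = np.split(theta, 3)
    t = np.tanh(np.outer(x, w) + b)
    return (-2.0 * t * (1.0 - t**2) * w**2) @ a

def f(x):
    return np.pi**2 * np.sin(np.pi * x)

def loss(theta, x):
    residual = -u_xx(theta, x) - f(x)        # PDE residual at collocation points
    bc = u(theta, np.array([0.0, 1.0]))      # boundary-condition penalty
    return np.mean(residual**2) + 100.0 * np.mean(bc**2)

x_col = rng.uniform(0.0, 1.0, size=32)       # random collocation grid

def num_grad(theta, x, eps=1e-6):
    # Central-difference gradient; a real PINN would use autodiff here too
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (loss(theta + d, x) - loss(theta - d, x)) / (2.0 * eps)
    return g

loss0 = loss(theta, x_col)
for _ in range(500):
    theta -= 1e-3 * num_grad(theta, x_col)   # plain gradient descent
print(loss0, loss(theta, x_col))
```

Even this crude setup drives the residual loss down from its initial value; the cited PINN/DGM papers use deep networks, automatic differentiation, and stochastic optimizers, but the loss they minimize has exactly this residual-plus-boundary structure.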