2021
DOI: 10.1137/20m1366587

Galerkin Neural Networks: A Framework for Approximating Variational Equations with Error Control

Abstract: We present extended Galerkin neural networks (xGNN), a variational framework for approximating general boundary value problems (BVPs) with error control. The main contributions of this work are (1) a rigorous theory guiding the construction of new weighted least squares variational formulations suitable for use in neural network approximation of general BVPs, and (2) an "extended" feedforward network architecture which incorporates, and is even capable of learning, singular solution structures, thus greatly improving…
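
The core mechanism behind this line of work is to build the approximation space adaptively: each new basis function is a neural network trained to maximize the normalized variational residual of the current Galerkin approximation, and the maximized residual itself serves as a computable estimate of the error. The sketch below is a hypothetical PyTorch illustration of that basic Galerkin neural network loop for a 1D Poisson model problem; it does not reproduce the paper's weighted least squares formulations or the extended (xGNN) architecture, and the network sizes, optimizer settings, quadrature, and boundary-condition trick are all assumptions made for illustration.

```python
# Minimal sketch (illustrative assumptions, not the authors' implementation) of the
# basic Galerkin neural network loop for -u'' = f on (0,1), u(0) = u(1) = 0, with
# f = pi^2 sin(pi x), so the exact solution is u(x) = sin(pi x). Each basis function
# is a small network trained to maximize the normalized weak residual of the current
# Galerkin approximation; the maximized value doubles as an energy-norm error estimate.
import math
import torch

torch.manual_seed(0)
n_quad = 2001
x = torch.linspace(0.0, 1.0, n_quad, requires_grad=True).unsqueeze(1)
w = torch.full_like(x, 1.0 / (n_quad - 1))          # trapezoidal quadrature weights
w[0] *= 0.5
w[-1] *= 0.5
f = math.pi**2 * torch.sin(math.pi * x)             # right-hand side

def grad(v):
    """dv/dx at the quadrature points via autograd."""
    return torch.autograd.grad(v, x, torch.ones_like(v), create_graph=True)[0]

def a_form(du, dv):                                  # a(u, v) = integral of u'v'
    return torch.sum(w * du * dv)

def l_form(v):                                       # L(v) = integral of f v
    return torch.sum(w * f * v)

class BasisNet(torch.nn.Module):
    """Small network; the x(1-x) factor enforces the Dirichlet conditions."""
    def __init__(self, width=20):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(1, width), torch.nn.Tanh(), torch.nn.Linear(width, 1))
    def forward(self, x):
        return x * (1.0 - x) * self.net(x)

basis, du_galerkin = [], torch.zeros_like(x)
for i in range(4):
    phi = BasisNet()
    opt = torch.optim.Adam(phi.parameters(), lr=1e-2)
    for _ in range(2000):                            # maximize <r(u_i), v> / ||v||_a
        opt.zero_grad()
        v = phi(x)
        dv = grad(v)
        residual = l_form(v) - a_form(du_galerkin, dv)
        loss = -residual / torch.sqrt(a_form(dv, dv))
        loss.backward()
        opt.step()
    print(f"iteration {i}: estimated energy-norm error {-loss.item():.3e}")
    basis.append(phi)
    # Galerkin solve in span{phi_1, ..., phi_{i+1}}
    dphis = [grad(p(x)) for p in basis]
    A = torch.stack([torch.stack([a_form(dj, dk) for dk in dphis]) for dj in dphis])
    b = torch.stack([l_form(p(x)) for p in basis])
    c = torch.linalg.solve(A.detach(), b.detach())
    du_galerkin = sum(ci * d for ci, d in zip(c, dphis)).detach()
```

Roughly speaking, each outer iteration yields both an improved Galerkin approximation and a computable residual-based error estimate, which is what "error control" refers to; the weighted formulations and extended architecture described in the abstract build on this loop to handle more general and singular problems.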

Cited by 17 publications (7 citation statements)
References 29 publications

“…For example, Fourier transforms, which are employed in the neural network and to represent derivatives in the loss function, are known to be highly sensitive to discontinuities. Thus, potential improvements may encompass the use of Galerkin neural networks [44] to better handle discontinuities, and loss functions that incorporate particle number conservation. These improvements should be considered in future work.…”
Section: Burgers Equation 2D Inviscid Results (mentioning, confidence: 99%)
“…Many approaches have been proposed to mitigate these failures, e.g. [49][50][51][52][53][54][55][56][57][58][59][60], and we note that many of these can be applied simultaneously with the continual learning approach proposed here. For this paper we focus on PINNs applied alone with continual learning to show the improvement that continual learning can offer.…”
Section: Physics-Informed Training (mentioning, confidence: 99%)
“…However, as in other ML-based approaches, they are not able to accurately predict numerical solutions to stiff PDEs which exhibit a sharp transition near the boundary. Recently, the Galerkin Neural Network (GNN) was introduced in [55], [56] to learn a test function for a single instance and to resolve the boundary layer issue arising from reaction-diffusion-type singular perturbation problems.…”
Section: Related Work (mentioning, confidence: 99%)