2021 · Preprint
DOI: 10.48550/arxiv.2105.14094
Galerkin Neural Networks: A Framework for Approximating Variational Equations with Error Control

Abstract: We present a new approach to using neural networks to approximate the solutions of variational equations, based on the adaptive construction of a sequence of finite-dimensional subspaces whose basis functions are realizations of a sequence of neural networks. The finite-dimensional subspaces are then used to define a standard Galerkin approximation of the variational equation. This approach enjoys a number of advantages, including: the sequential nature of the algorithm offers a systematic approach to enhancing…
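The abstract describes an iterative, residual-driven construction, which is straightforward to prototype. Below is a minimal sketch of that loop, assuming a 1D Poisson model problem -u'' = f on (0, 1) with homogeneous Dirichlet conditions: at each stage a fresh network is trained to maximize the residual-based estimator ⟨r(u_i), φ⟩ / ‖φ‖_a, its realization is appended to the basis, and the Galerkin system on the enlarged subspace is re-solved. The maximized estimator doubles as a computable energy-norm error indicator, which is the sense in which the title promises error control. The model problem, network width, optimizer settings, stage count, and quadrature rule here are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch of a Galerkin neural network loop for
# -u'' = f on (0, 1), u(0) = u(1) = 0, with f chosen so u = sin(pi x).
import torch

torch.manual_seed(0)

# Midpoint quadrature on (0, 1).
n_q = 512
x = ((torch.arange(n_q, dtype=torch.float64) + 0.5) / n_q).reshape(-1, 1)
x.requires_grad_(True)
f = (torch.pi ** 2 * torch.sin(torch.pi * x)).detach()

def d_dx(u):
    # Derivative of u(x) on the quadrature grid via autograd.
    return torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]

def realize(net):
    # Value and derivative of the network's realization; the factor
    # x(1 - x) imposes the homogeneous Dirichlet conditions exactly.
    phi = x * (1.0 - x) * net(x)
    return phi, d_dx(phi)

def new_network(width=20):
    return torch.nn.Sequential(
        torch.nn.Linear(1, width), torch.nn.Tanh(),
        torch.nn.Linear(width, width), torch.nn.Tanh(),
        torch.nn.Linear(width, 1)).double()

vals, ders = [], []                    # realized basis functions (frozen)
u_x = torch.zeros_like(x)              # d/dx of the current Galerkin solution

for stage in range(4):
    net = new_network()
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        phi, phi_x = realize(net)
        resid = (f * phi).mean() - (u_x * phi_x).mean()      # <r(u_i), phi>
        eta = resid / ((phi_x ** 2).mean().sqrt() + 1e-12)   # error estimator
        (-eta).backward()              # train phi to maximize the estimator
        opt.step()

    # Freeze the trained basis function and report the error estimate.
    phi, phi_x = realize(net)
    resid = (f * phi).mean() - (u_x * phi_x).mean()
    eta = (resid / ((phi_x ** 2).mean().sqrt() + 1e-12)).item()
    vals.append(phi.detach()); ders.append(phi_x.detach())
    print(f"stage {stage}: estimated energy-norm error {eta:.3e}")

    # Standard Galerkin solve on span{phi_1, ..., phi_{i+1}} with
    # a(u, v) = integral of u'v' and F(v) = integral of f v.
    V, P = torch.cat(vals, dim=1), torch.cat(ders, dim=1)
    A = P.T @ P / n_q                  # A_ij = a(phi_i, phi_j)
    b = V.T @ f / n_q                  # b_i  = F(phi_i)
    c = torch.linalg.solve(A, b)
    u_x = P @ c

u = V @ c
l2_err = ((u - torch.sin(torch.pi * x)) ** 2).mean().sqrt().item()
print(f"L2 error vs. exact solution: {l2_err:.3e}")
```

Note the design point the sequential construction buys: because the Galerkin coefficients of all earlier basis functions are recomputed at every stage, a new basis function only needs to capture the current residual, and imperfections in earlier training are corrected rather than accumulated.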

Cited by 5 publications (5 citation statements) · References 24 publications
“…The current approach adopts the probabilistic viewpoint with the aim of improving approximation, while the previously cited works generally focus on quantifying uncertainty. In a deterministic context, several works have pursued other strategies to realize convergence in deep networks [He et al., 2018, Cyr et al., 2020, Adcock and Dexter, 2021, Fokina and Oseledets, 2019, Ainsworth and Dong, 2021]. In the context of ML for reduced basis construction, several works have focused primarily on using either Gaussian processes and PCA [Guo and Hesthaven, 2018] or classical/variational autoencoders as replacements for PCA [Lee and Carlberg, 2020, Lopez and Atzberger, 2020] in classical ROM schemes; this is distinct from the control volume type surrogates considered in which requires a reduced basis corresponding to a partition of space.…”
Section: Introduction
confidence: 99%
“…We emphasize that the procedure we propose can, in principle, complement any operator regression technique that can furnish high-quality spatial functions, e.g., [17-19]. Our technique can also be seen in the context of several important methodologies developed recently combining deep learning methods with variational formulations of PDEs [20-22].…”
Section: Machine-Learning-Based Spectral Methods for Partial Differential Equations
confidence: 99%
“…However, their physics-constrained loss can lead to discretization artifacts and requires a pressure gradient regularizer for high Reynolds numbers. Further learning-based approaches also exploit graph representations in terms of graph neural networks [Harsch and Riedelbauch 2021, Sanchez-Gonzalez et al. 2020], graph convolutional networks [Gao, Zahr, and Wang 2021], as well as mesh representations [Pfaff et al. 2021] or subspace representations [Sirignano and Spiliopoulos 2018b, Ainsworth and Dong 2021]. Unfortunately, graph neural networks cannot make use of the highly efficient implementations for convolutional operations on grids and thus usually come with higher computational complexity per node compared to grid-based CNNs.…”
Section: Related Work
confidence: 99%