2020
DOI: 10.1016/j.jcp.2020.109345
Constraint-aware neural networks for Riemann problems

Abstract: Neural networks are increasingly used in complex (data-driven) simulations as surrogates or for accelerating the computation of classical surrogates. In many applications physical constraints, such as mass or energy conservation, must be satisfied to obtain reliable results. However, standard machine learning algorithms are generally not tailored to respect such constraints. We propose two different strategies to generate constraint-aware neural networks. We test their performance in the context of front-capturi…

Cited by 54 publications (26 citation statements)
References 36 publications (95 reference statements)
“…This is the approach followed by several authors in the context of physical simulations, which aim to solve a set of partial differential equations (PDEs) in complex dynamical systems. Physical problems must satisfy inherently certain conditions dictated by physics, often formulated as conservation laws, and can be imposed to a neural network using extra loss terms in the constrained optimization process [32].…”
Section: Introduction (mentioning; confidence: 99%)
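The excerpt above describes the soft-constraint strategy: augmenting the training objective with an extra loss term that penalizes violation of a physical conservation law. A minimal NumPy sketch of such a penalized loss follows; the names `constrained_loss`, `mass_total`, and `lam`, and the specific choice of penalizing total mass, are hypothetical illustrations, not taken from the cited works:

```python
import numpy as np

def constrained_loss(pred, target, mass_total, lam=10.0):
    """Data-fit loss plus a soft penalty on a conservation constraint.

    pred, target: predicted and reference state vectors.
    mass_total:   the conserved quantity the prediction should reproduce.
    lam:          penalty weight trading off data fit vs. constraint.
    (All names are hypothetical, for illustration only.)
    """
    # Standard mean-squared data loss.
    data_loss = np.mean((pred - target) ** 2)
    # Soft constraint: squared deviation of the predicted total mass
    # from the prescribed conserved value.
    constraint_violation = (np.sum(pred) - mass_total) ** 2
    return data_loss + lam * constraint_violation
```

During gradient-based training, the penalty term steers the network toward constraint-satisfying outputs, but it only enforces the constraint approximately; how well it holds depends on the weight `lam`.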
“…The overview paper by Karpatne et al (2017) provides a taxonomy for theory-guided data science, with the goal of incorporating scientific consistency in the learning of generalizable models. Much research in physics-informed machine learning has focused on incorporating constraints in neural networks (Ling et al, 2016;Jones et al, 2018), often through the use of objective/loss functions that penalize constraint violation (Magiera et al, 2020).…”
Section: Introduction (mentioning; confidence: 99%)
“…Several studies have been conducted to model the dynamics of chaotic fluid flow using ML algorithms [25][26][27][28][29][30][31]. Recently there is a growing interest in using the physics of the problem in combination with the data-driven algorithms [28,[32][33][34][35][36][37][38]. The physics can be incorporated into these learning algorithms by adding a regularization term (based on governing equations) in the loss function or modifying the neural network architecture to enforce certain physical constraints. In addition to reduced order modeling and chaotic dynamical systems, the turbulence closure problem has also benefited from the application of ML algorithms and has led to reducing uncertainties in .…”
(mentioning; confidence: 99%)
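The second mechanism named in the excerpt above, modifying the network architecture so a constraint holds by construction, can be sketched as a final layer that rescales the raw outputs to sum exactly to a prescribed conserved quantity. This is an assumed illustration only; `conservative_output` and its arguments are hypothetical names, not the cited papers' exact mechanism:

```python
import numpy as np

def conservative_output(raw_output, total_mass):
    """Hard constraint by construction: map the raw network output to
    positive values, then rescale so the components sum exactly to the
    prescribed conserved quantity (hypothetical sketch)."""
    positive = np.abs(raw_output) + 1e-12  # keep entries positive, avoid /0
    return total_mass * positive / positive.sum()
```

Unlike the penalty approach, such an output layer satisfies the constraint exactly for every input, at the cost of restricting the hypothesis space the network can represent.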