2020
DOI: 10.48550/arxiv.2012.05343
Preprint

Thermodynamically consistent physics-informed neural networks for hyperbolic systems

Abstract: Physics-informed neural network architectures have emerged as a powerful tool for developing flexible PDE solvers which easily assimilate data, but face challenges related to the PDE discretization underpinning them. By instead adapting a least squares space-time control volume scheme, we circumvent issues particularly related to imposition of boundary conditions and conservation while reducing solution regularity requirements. Additionally, connections to classical finite volume methods allow application of …

Cited by 4 publications (6 citation statements)
References 63 publications
“…Physics-Guided Machine Learning. Physics-guided machine learning is becoming an increasingly common method for solving problems in a wide variety of physics-dependent fields such as fluid mechanics [7,20,28,34], electromagnetism [19,26,37], thermodynamic modeling [2,16,27,36], and even in medical engineering [29]. By imposing physical constraints to respect any symmetries [35], invariances [20], or conservation principles [4], researchers are able to constrain the space of admissible solutions to a manageable size even with a few hundred data points.…”
Section: Related Workmentioning
confidence: 99%
“…To perform Gaussian process regression, we suppose that the behavior of the function of interest is described by some covariance function k(x, x′). The posterior mean prediction (7) in GPR for a function u at a point x, given data (X, y) = {(x_i, y_i)}_{i=1}^{N}, can be written as…”
Section: Eigenfunction Expansion Kernel Functions For Boundary Condit...mentioning
confidence: 99%
“…the mean and variance of the GP posterior prediction for f_* = f(x_*), formulas (7) and (8), respectively, can be written as…”
Section: Combining Boundary Value and Linear Pde Constraintsmentioning
confidence: 99%
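The two excerpts above refer to the standard GP posterior mean and variance formulas. As a minimal sketch of what those formulas compute, the following uses an RBF covariance function and NumPy; the kernel choice, function names, and noise level are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential covariance k(x, x') = exp(-|x - x'|^2 / (2 l^2));
    # an assumed choice, any positive-definite kernel works the same way.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # K + sigma^2 I on the training inputs
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)    # cross-covariance k(x_*, X)
    K_ss = rbf_kernel(x_test, x_test)    # prior covariance at test points
    alpha = np.linalg.solve(K, y_train)  # [K + sigma^2 I]^{-1} y
    mean = K_s @ alpha                   # posterior mean, formula (7)
    v = np.linalg.solve(K, K_s.T)
    cov = K_ss - K_s @ v                 # posterior covariance, formula (8)
    return mean, cov

# Fit noisy-free observations of sin(x) and query at a training point.
x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mean, cov = gp_posterior(x, y, np.array([1.0]))
```

At a training input with near-zero noise, the posterior mean reproduces the observed value and the posterior variance collapses toward zero, which is the interpolation property the cited work exploits when imposing boundary conditions through the kernel.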
“…A number of scientific machine learning (ML) tasks seek to discover a dynamical system whose solution is consistent with data (e.g. constitutive modeling (Patel et al 2020; Karapiperis et al 2021; Ghnatios et al 2019; Masi et al 2021), reduced-order modeling (Lee and Carlberg 2021; Wan et al 2018), physics-informed machine learning (Karniadakis et al 2021; Wu, Xiao, and Paterson 2018), and surrogates for performing optimal control (Alexopoulos, Nikolakis, and Chryssolouris 2020)). A major challenge for this class of problems is the preservation of both numerical stability and physical realizability when performing out-of-distribution inference (i.e.…”
Section: Introductionmentioning
confidence: 99%