2019
DOI: 10.1137/18m1191944
Data-Driven Identification of Parametric Partial Differential Equations

Abstract: In this work we present a data-driven method for the discovery of parametric partial differential equations (PDEs), thus allowing one to disambiguate between the underlying evolution equations and their parametric dependencies. Group sparsity is used to ensure parsimonious representations of observed dynamics in the form of a parametric PDE, while also allowing the coefficients to have arbitrary time-series or spatial dependence. This work builds on previous methods for the identification of constant coeffici…
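As a sketch of the group-sparsity idea in the abstract, the snippet below fits one least-squares problem per time snapshot and then thresholds whole coefficient groups (one candidate library term across all snapshots) on their joint norm, so the recovered model has a shared sparsity pattern but coefficients that vary in time. The function name, threshold, and data layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def group_sparse_regression(Theta_list, dudt_list, threshold=0.1, n_iters=10):
    """Hypothetical sketch of group-sparse regression for parametric PDE
    discovery: a separate least-squares fit per snapshot, with sparsity
    enforced by thresholding each library term's coefficient group
    (its row across all snapshots) on its joint norm."""
    n_terms = Theta_list[0].shape[1]
    n_snap = len(Theta_list)
    Xi = np.zeros((n_terms, n_snap))          # one coefficient column per snapshot
    active = np.ones(n_terms, dtype=bool)     # shared support across snapshots
    for _ in range(n_iters):
        # Fit each snapshot independently on the currently active columns.
        for k, (Theta, dudt) in enumerate(zip(Theta_list, dudt_list)):
            Xi[:, k] = 0.0
            Xi[active, k] = np.linalg.lstsq(Theta[:, active], dudt, rcond=None)[0]
        # Group threshold: a term survives only if its RMS coefficient
        # across all snapshots exceeds the cutoff.
        group_norms = np.linalg.norm(Xi, axis=1) / np.sqrt(n_snap)
        active = group_norms > threshold
        Xi[~active, :] = 0.0
    return Xi
```

On noiseless synthetic data with two active library terms whose coefficients drift between snapshots, this sketch recovers both the shared support and the per-snapshot coefficient values.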

Cited by 250 publications (185 citation statements)
References 74 publications
“…Figure 6 shows the predicted mean and standard deviation of the solution u versus the reference. Figure 7 shows our DNN prediction of three u modes where the reference modes are calculated by Equation 17. We can see that the NN-aPC method makes accurate predictions of the mean and standard deviation of the solution u(x; ω) and learns the arbitrary polynomial chaos modes.…”
Section: Forward Problem: Stochastic Poisson Equation (A Pedagogical …)
Citation type: mentioning, confidence: 96%
“…For example, for the forward problem, some of the popular choices of machine learning tools are Gaussian process [3,4,5,6,7] and deep neural networks (DNNs) [8,9,10,11,12]. For inverse problems, similar methods have been advanced, e.g., Bayesian estimation [13] and variational Bayes inference [14], and have been proposed for a wide variety of objectives, from parameter estimation [15] to discovering partial differential equations [16,17,18,19] to learning constitutive relationships [20].…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…SINDy solves an overdetermined linear system of equations by sparsity-promoting regularization. The basic algorithmic structure of SINDy has been modified to discover parametrically-dependent systems [40], resolve multiscale physics [8], infer biological networks [34], discover spatiotemporal systems [41], and identify nonlinear systems with control [6,22]. Consider a dynamical system of the form ẋ(t) = f(x(t))…”
Section: Sensors
Citation type: mentioning, confidence: 99%
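The sequentially thresholded least squares at the core of SINDy, as described in the excerpt above, can be sketched in a few lines of NumPy. The function name and threshold value are illustrative; this is a minimal sketch under the standard formulation, not any cited author's implementation.

```python
import numpy as np

def sindy_stls(Theta, dXdt, threshold=0.1, n_iters=10):
    """Minimal sketch of sequentially thresholded least squares (SINDy):
    solve the overdetermined system Theta @ Xi ~= dX/dt, zero out small
    coefficients, and refit on the surviving library terms."""
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(n_iters):
        small = np.abs(Xi) < threshold     # candidate terms to prune
        Xi[small] = 0.0
        for j in range(dXdt.shape[1]):     # refit each state equation
            big = ~small[:, j]
            if big.any():
                Xi[big, j] = np.linalg.lstsq(Theta[:, big], dXdt[:, j],
                                             rcond=None)[0]
    return Xi
```

Given a candidate library Theta whose columns are nonlinear functions of the state (e.g. 1, x, y, xy) and measured derivatives dX/dt, the returned coefficient matrix Xi identifies which library terms are active in each governing equation.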
“…Emerging data-driven methods are allowing for the discovery of physical and engineering principles directly from time-series recordings. Our focus is on the SINDy architecture [5], which has been demonstrated on a diverse set of problems, including spatio-temporal [41], parametric [40], networked [34], control [6], and multiscale [8] systems. Importantly, the SINDy architecture can be directly related to model selection theory [34] in order to assess the quality and robustness of the model discovered.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
“…The proposed framework replaces the guessing work often involved in such model development with a data-driven approach that uncovers the closure from data in a systematic fashion. Our approach draws inspiration from the early and contemporary contributions in deep learning for partial differential equations [14][15][16][17][18][19][20] and data-driven modeling strategies [21][22][23], and in particular relies on recent developments in physics-informed deep learning [24] and deep hidden physics models [25].…”
Section: Introduction
Citation type: mentioning, confidence: 99%