2017
DOI: 10.1016/j.jcp.2017.07.050

Machine learning of linear differential equations using Gaussian processes

Abstract: This work leverages recent advances in probabilistic machine learning to discover conservation laws expressed by parametric linear equations. Such equations involve, but are not limited to, ordinary and partial differential, integro-differential, and fractional order operators. Here, Gaussian process priors are modified according to the particular form of such operators and are employed to infer parameters of the linear equations from scarce and possibly noisy observations. Such observations may come from expe…
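The abstract's core idea is that a linear operator maps a Gaussian process to another Gaussian process, so operator parameters can be inferred by maximizing the joint marginal likelihood of observations of both the function and its image under the operator. A minimal one-dimensional sketch of this idea (my own toy example with assumed data, not the paper's code or notation): an RBF prior on u, the operator L_phi u = phi * du/dx, and a grid search over phi.

```python
import numpy as np

# Toy problem (assumed for illustration): u(x) = sin(x), and observations of
# f = L_phi u = phi * du/dx with true phi = 2. We infer phi by maximizing the
# joint GP log marginal likelihood of (u, f) under an RBF prior on u.
X = np.linspace(0.0, 2.0 * np.pi, 12)
phi_true = 2.0
yu = np.sin(X)                   # observations of u
yf = phi_true * np.cos(X)        # observations of f = phi * u'
y = np.concatenate([yu, yf])
noise = 1e-4                     # small observation-noise / jitter variance

def log_marginal_likelihood(phi):
    """Joint likelihood of (u, f) for the RBF kernel k(x, x') = exp(-r^2 / 2)."""
    r = X[:, None] - X[None, :]
    E = np.exp(-0.5 * r ** 2)
    Kuu = E                                # cov(u, u) = k
    Kuf = phi * r * E                      # cov(u, f) = phi * dk/dx'
    Kff = phi ** 2 * (1.0 - r ** 2) * E    # cov(f, f) = phi^2 * d2k/(dx dx')
    K = np.block([[Kuu, Kuf], [Kuf.T, Kff]]) + noise * np.eye(2 * len(X))
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, y)
    return -0.5 * y @ alpha - 0.5 * logdet - 0.5 * len(y) * np.log(2.0 * np.pi)

grid = np.arange(0.5, 4.0, 0.05)
phi_hat = grid[np.argmax([log_marginal_likelihood(p) for p in grid])]
print(f"estimated phi = {phi_hat:.2f}")    # should land near phi_true = 2.0
```

The key step is that the cross- and output-covariances `Kuf` and `Kff` are obtained by applying the operator to the kernel itself, so the parameter phi enters the likelihood analytically; the paper applies the same construction to general linear differential, integro-differential, and fractional operators.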


Cited by 506 publications (363 citation statements)
References 31 publications
“…In Raissi et al [2017b], a data-driven algorithm for learning the coefficients of general parametric linear differential equations from noisy data was introduced, solving a so-called inverse problem. The approach employs GP priors that are tailored to the corresponding and known type of differential operators, resulting in design and algorithmic transparency.…”
Section: Prediction of Scientific Parameters and Properties
confidence: 99%
“…Two immediate opportunities for machine learning in multiscale modeling include learning the underlying physics [104] and learning the parameters for a known physics-based problem. Recent examples of learning the underlying physics are the data-driven solution of problems in elasticity [23] and the data-driven discovery of partial differential equations for nonlinear dynamical systems [14,95,98]. This class of problems holds great promise, especially in combination with deep learning, but involves a thorough understanding and direct interaction with the underlying learning machines [100].…”
Section: Motivation
confidence: 99%
“…Similarly, concatenate s_0 and all the s_e into a single vector s, defining K as the space of all s that have sub-vector s_0 ∈ Q^{3n+1} and all sub-vectors s_e ∈ Q^4. With this notation, Equation 3 can be written more formally as a second order cone program (SOCP):…”
Section: Inextensibility Prior
confidence: 99%
“…Since a number of authors have begun to consider the use of machine/deep learning for problems in traditional computational physics, see e.g. [1,2,3,4,5,6,7,8,9,10,11,12], we are motivated to consider methodologies that constrain the interpolatory results of a network to be contained within a physically admissible region. Quite recently, [13] proposed adding physical constraints to generative adversarial networks (GANs) also considering projection as we do, while stressing the interplay between scientific computing and machine learning; we refer the interested reader to their work for even more motivation for such approaches.…”
Section: Introduction
confidence: 99%