2021
DOI: 10.1073/pnas.2020397118

Inference of dynamic systems from noisy and sparse data via manifold-constrained Gaussian processes

Abstract: Parameter estimation for nonlinear dynamic system models, represented by ordinary differential equations (ODEs), using noisy and sparse data is a vital task in many fields. We propose a fast and accurate method, manifold-constrained Gaussian process inference (MAGI), for this task. MAGI uses a Gaussian process model over time series data, explicitly conditioned on the manifold constraint that derivatives of the Gaussian process must satisfy the ODE system. By doing so, we completely bypass the need for numeri…
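To make the manifold constraint concrete, the sketch below (Python with NumPy; the RBF kernel, the one-dimensional state, and all function names are illustrative assumptions rather than the paper's implementation, which handles multi-component systems and unobserved components) writes an unnormalized MAGI-style log-posterior: a GP prior on the latent trajectory, a Gaussian noise term for the observations, and a term comparing the ODE right-hand side against the GP-implied conditional distribution of the trajectory's derivative.

```python
import numpy as np

def rbf_kernel_with_derivatives(t, lengthscale=1.0, variance=1.0):
    """RBF kernel C(s, u) on a 1-d grid t, plus the cross-covariance
    dC = d/ds C and the derivative kernel ddC = d^2/(ds du) C."""
    diff = t[:, None] - t[None, :]
    C = variance * np.exp(-0.5 * (diff / lengthscale) ** 2)
    dC = -(diff / lengthscale**2) * C
    ddC = (1.0 / lengthscale**2 - (diff / lengthscale**2) ** 2) * C
    return C, dC, ddC

def magi_style_log_posterior(x, theta, y, t, f_ode, sigma=0.1, lengthscale=1.0):
    """Unnormalized log-posterior of the latent trajectory x and ODE
    parameters theta, conditioned on the manifold constraint that the
    GP derivative matches f_ode(x, theta, t).

    y holds noisy observations on the grid t, with NaN where unobserved.
    This is a 1-d illustrative sketch, not the published MAGI code.
    """
    C, dC, ddC = rbf_kernel_with_derivatives(t, lengthscale)
    jitter = 1e-6 * np.eye(len(t))
    Cinv = np.linalg.inv(C + jitter)

    # GP prior on the latent trajectory x(I)
    lp = -0.5 * x @ Cinv @ x

    # Gaussian noise model for the observed entries of y
    obs = ~np.isnan(y)
    lp += -0.5 * np.sum((y[obs] - x[obs]) ** 2) / sigma**2

    # Manifold constraint: the ODE-implied derivatives should agree with
    # the GP-implied conditional law of x'(I) given x(I), which is
    # Gaussian with mean m and covariance K below.
    m = dC @ Cinv @ x
    K = ddC - dC @ Cinv @ dC.T + jitter
    resid = f_ode(x, theta, t) - m
    lp += -0.5 * resid @ np.linalg.solve(K, resid)
    return lp
```

For instance, with f_ode = lambda x, theta, t: theta[0] * x * (1 - x / theta[1]) (logistic growth), one could maximize or sample this quantity jointly over (x, theta); the paper samples its analogous posterior with Hamiltonian Monte Carlo.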

Cited by 23 publications (34 citation statements)
References 12 publications
“…Gaussian Processes have been used to identify SDE from observations [52], with a target likelihood based on the Euler-Maruyama scheme (not explicitly mentioned by the authors). Our setting is very different from identifying ODE from noisy observations, for example through Gaussian Processes [51]. Generative stochastic modeling of strongly nonlinear flows with non-Gaussian statistics is also possible [3].…”
Section: Related Work
confidence: 99%
“…Minimizing L in (5) over the data D implies maximization of the log marginal likelihood (4) with the constant terms removed (as they do not influence the minimization) [41]. Likelihood estimation in combination with the normal distribution is used in many variational and generative approaches [19,28,33,52,51,49]. Note that here, the step size h^(k) is defined per snapshot, so it is possible that it has different values for every index k. This will be especially useful in coarse-graining Gillespie simulations, where the time step is determined as part of the scheme.…”
Section: Identification of Drift and Diffusivity with the Euler-Maruyama…
confidence: 99%
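As a concrete, hedged reading of that likelihood (Python/NumPy; the drift/diffusivity signatures and the factor of 2 in the noise variance are illustrative conventions, not the cited works' code), the Euler-Maruyama transition density with a per-snapshot step size h^(k) gives the following negative log-likelihood over snapshot pairs:

```python
import numpy as np

def euler_maruyama_nll(x0, x1, h, drift, diffusivity, params):
    """Negative log-likelihood of snapshot pairs (x0[k] -> x1[k]) under an
    Euler-Maruyama discretization of dX = drift dt + sqrt(2*diffusivity) dW.

    h is an array of per-snapshot step sizes h^(k), which may differ across
    k (e.g. when snapshots come from coarse-grained Gillespie simulations).
    """
    mean = x0 + drift(x0, params) * h          # Gaussian transition mean
    var = 2.0 * diffusivity(x0, params) * h    # Gaussian transition variance
    return 0.5 * np.sum((x1 - mean) ** 2 / var + np.log(2.0 * np.pi * var))
```

Minimizing this quantity over snapshot data is the same, up to additive constants, as maximizing the Gaussian log marginal likelihood induced by the Euler-Maruyama discretization, and nothing forces the entries of h to be equal.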
“…The explicit derivative information is further utilized to improve a general GP's performance for data that are generated from differential equations [24]. The derivative in a given system of differential equations is further harnessed through a constraint manifold such that the derivatives of the Gaussian process must match an ordinary differential equation (ODE) [25]. Despite their success, these works generally require explicitly known differential equations to work.…”
Section: Introduction
confidence: 99%
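To illustrate how explicit derivative information can be folded into a GP (a generic gradient-enhanced GP regression sketch in Python/NumPy with an RBF kernel and one-dimensional inputs; it is not the cited papers' exact construction), one can condition jointly on observed function values and observed derivatives:

```python
import numpy as np

def _rbf_blocks(s, t, ell=1.0, var=1.0):
    """Covariance blocks for an RBF kernel: C = Cov(f(s), f(t)),
    dC = Cov(f'(s), f(t)), ddC = Cov(f'(s), f'(t))."""
    diff = s[:, None] - t[None, :]
    C = var * np.exp(-0.5 * (diff / ell) ** 2)
    dC = -(diff / ell**2) * C
    ddC = (1.0 / ell**2 - (diff / ell**2) ** 2) * C
    return C, dC, ddC

def gp_predict_with_derivatives(t_obs, y_obs, dy_obs, t_new,
                                ell=1.0, var=1.0, noise=1e-4):
    """Posterior mean of f(t_new) given noisy function values y_obs and
    derivative observations dy_obs at the training inputs t_obs."""
    C, dC, ddC = _rbf_blocks(t_obs, t_obs, ell, var)
    # Joint covariance of the stacked training vector [f(t_obs), f'(t_obs)];
    # the derivative of a GP is again a GP, so this is one joint Gaussian.
    K = np.block([[C, dC.T], [dC, ddC]]) + noise * np.eye(2 * len(t_obs))
    # Cross-covariance of f(t_new) with the stacked training vector.
    Cs, dCs, _ = _rbf_blocks(t_new, t_obs, ell, var)
    K_star = np.hstack([Cs, -dCs])
    return K_star @ np.linalg.solve(K, np.concatenate([y_obs, dy_obs]))
```

The key fact used here is that differentiation is a linear operator, so function values and derivatives share one joint Gaussian whose blocks are kernel derivatives; this is what lets derivative information sharpen the GP fit.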
“…A closely related work is [26], where GP is used as a generalization for a parametric function for binary images. However, their work cannot be directly implemented in our problem because our systems of equations will lead to a mixture of GPs that are augmented by the derivative of concentrations, whereas there is normally only one GP to estimate in most of the previous works [23][24][25][26][27].…”
Section: Introduction
confidence: 99%
“…This has motivated a growing body of research considering interacting agent systems on various manifolds [19,6,30], including opinion dynamics [2], flocking models [1] and a classical aggregation model [5]. Further recent approaches for interacting agents on manifolds include [37,32].…”
Section: Introduction
confidence: 99%