2018
DOI: 10.1063/1.5018409
Sparse learning of stochastic dynamical equations

Abstract: With the rapid increase of available data for complex systems, there is great interest in the extraction of physically relevant information from massive datasets. Recently, a framework called Sparse Identification of Nonlinear Dynamics (SINDy) has been introduced to identify the governing equations of dynamical systems from simulation data. In this study, we extend SINDy to stochastic dynamical systems, which are frequently used to model biophysical processes. We prove the asymptotic correctness of stochastic S…
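The abstract describes extending SINDy to stochastic systems via Kramers-Moyal formulae: the drift of an SDE is the first Kramers-Moyal coefficient, a conditional average of trajectory increments, which can then be regressed onto a sparse library of candidate functions. The sketch below illustrates the idea on a hypothetical Ornstein-Uhlenbeck process (not an example from the paper), using plain least squares in place of the paper's full sparse-learning procedure:

```python
import numpy as np

# Hypothetical test system: Ornstein-Uhlenbeck process
#   dX = -theta * X dt + sigma dW,
# simulated with Euler-Maruyama. The drift -theta*x is what we try to recover.
rng = np.random.default_rng(0)
theta, sigma, dt, n = 1.0, 0.5, 1e-3, 200_000
x = np.empty(n)
x[0] = 0.0
noise = rng.standard_normal(n - 1)
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * noise[i]

# First Kramers-Moyal coefficient: drift(x) ~ E[dX | X = x] / dt.
# Here we use raw per-step increments and let the regression do the averaging.
dx = np.diff(x) / dt

# Candidate library of basis functions evaluated along the trajectory.
library = np.column_stack([np.ones(n - 1), x[:-1], x[:-1] ** 2])

# Ordinary least squares fit of the estimated drift onto the library.
coeffs, *_ = np.linalg.lstsq(library, dx, rcond=None)
print(coeffs)  # linear coefficient should be close to -theta = -1.0
```

The constant and quadratic coefficients come out near zero, so a sparsification step (thresholding or an L1 penalty, as in SINDy) would eliminate them and leave the linear drift term.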


Cited by 203 publications (218 citation statements)
References 28 publications
“…The discovered eigenfunctions are then used for control, resulting in the so-called KRONIC framework. Another extension of SINDy was derived in [21], allowing for the identification of parameters of a stochastic system using Kramers-Moyal formulae. A different avenue towards system identification was taken in [22,23]. Here, the Koopman operator is first approximated with the aid of EDMD, and then its generator is determined using the matrix logarithm.…”
mentioning
confidence: 99%
“…Comparing the results for λ = 0.01, we observe that the solutions are slightly sparser for λ = 0.2 and λ = 0.1 (e.g., coefficients corresponding to the basis φ 1 ≡ 1 in Table 10), while the approximation of the true coefficients is better when λ is smaller (i.e., λ = 0.01, underlined coefficients in Table 10). Finally, we point out that the solution could be further improved if necessary, by using thresholding techniques (i.e., removing unimportant basis functions) [10] or cross-validation techniques (i.e., tuning λ) [8].…”
Section: Example 2: Predator-prey System
mentioning
confidence: 99%
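The statement above discusses tuning the sparsity parameter λ and mentions thresholding (removing unimportant basis functions) as a way to further sparsify the solution. A minimal sketch of sequentially thresholded least squares, the standard thresholding scheme in the SINDy literature, is shown below on a synthetic problem (the function and data here are illustrative, not from the cited work):

```python
import numpy as np

def stlsq(library, target, threshold, n_iter=10):
    """Sequentially thresholded least squares (sketch).

    Repeatedly fits by ordinary least squares, zeroes coefficients whose
    magnitude falls below `threshold`, and refits on the surviving basis
    functions. `threshold` plays the role of the sparsity parameter.
    """
    coeffs, *_ = np.linalg.lstsq(library, target, rcond=None)
    for _ in range(n_iter):
        small = np.abs(coeffs) < threshold
        coeffs[small] = 0.0
        big = ~small
        if not big.any():
            break  # everything was thresholded away
        coeffs[big], *_ = np.linalg.lstsq(library[:, big], target, rcond=None)
    return coeffs

# Toy demonstration: the target depends only on the second basis function.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 4))
y = 2.0 * X[:, 1] + 0.01 * rng.standard_normal(500)
recovered = stlsq(X, y, threshold=0.1)
print(recovered)  # only the coefficient at index 1 should survive
```

Cross-validation over `threshold` (or λ, for penalized variants) is the usual way to trade off sparsity against fit quality, as the excerpt notes.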
“…Let us first review related work and summarize the contributions of this paper. The reconstruction of the governing equations from data using sparsity constraints is getting more and more attention, see [53,10,37,14] for methods pertaining to ordinary differential equations (ODEs) and [8] for learning stochastic differential equations (SDEs). For chemical and biological reaction systems, the problem of estimating unknown parameters has been well studied when the systems are modeled both as ODEs [28,3] and as continuous-time Markov chain processes [1,41,9,54], while the reconstruction of the entire chemical reaction networks, i.e., finding parsimonious models, has only been considered when the systems are modeled as ODE systems [51,37,14].…”
Section: Introduction
mentioning
confidence: 99%
“…Sparse learning of dynamical equations has been studied in Refs. [33,34]. Most of the previous work was aimed at recovering correct thermodynamics of the reduced system, that is, the distribution sampled by the effective dynamics should equal the distribution of the projected original process.…”
Section: Introduction
mentioning
confidence: 99%