2018
DOI: 10.48550/arxiv.1806.07366
Neural Ordinary Differential Equations

Cited by 175 publications (318 citation statements)
References 0 publications
“…We first estimate the drift g from the observation data using the variational formulation for the stationary Fokker-Planck equation (9) through the relative entropy rate. Suppose that the stationary probability density p_g(x) is given and we search for an estimator of the drift g(x) by minimizing the relative entropy rate (6). Introducing a Lagrange multiplier function ψ(x), we may derive the drift g(x) from the following Lagrange functional…”
Section: Variational Formulation
confidence: 99%
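The excerpt above truncates the Lagrange functional itself, so the following is only an assumed generic form of the construction it describes: the drift estimator minimizes a relative-entropy-rate objective H(g) while a multiplier ψ(x) enforces the stationary Fokker-Planck constraint. The operator and the diffusion coefficient σ below are assumptions, not taken from the cited work:

```latex
\mathcal{L}[g, \psi]
  \;=\; \mathcal{H}(g)
  \;+\; \int \psi(x)\,\mathcal{L}_g^{*} p_g(x)\,\mathrm{d}x,
\qquad
\mathcal{L}_g^{*} p \;=\; -\nabla\!\cdot\!\bigl(g(x)\,p\bigr)
  \;+\; \tfrac{\sigma^{2}}{2}\,\Delta p .
```

Here the stationarity condition \(\mathcal{L}_g^{*} p_g = 0\) is exactly the constraint the multiplier ψ enforces; setting the variation of \(\mathcal{L}[g,\psi]\) with respect to g to zero is what yields the drift estimator.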
“…There exist many different forms of data-driven methods, such as parametric and nonparametric approaches [6][7][8][9][10]. The recent sparse identification of nonlinear dynamics method, proposed by Brunton and co-workers [11], is a representative parametric approach.…”
Section: Introduction
confidence: 99%
“…We do not, however, have to resign ourselves to treating them as black boxes. Driven in part by the desire to derive physical intuition from deep methods, interrogation and interpretation methods for deep learning are an active and important area of research (e.g., Doshi-Velez & Kim 2017;Slavin Ross et al 2017;Chen et al 2018;Rudin 2019;Winkler et al 2019;Wu et al 2020;Yu et al 2019).…”
Section: Introduction
confidence: 99%
“…Recent years have seen a resurgence of numerical methods based on (deep) artificial neural networks (ANNs) for solving ordinary (ODEs) and partial differential equations (PDEs) (Budkina et al., 2016; Chen et al., 2018; E et al., 2017; E and Yu, 2018; Khodayi-Mehr and Zavlanos, 2020; Long et al., 2018, 2019; Mall and Chakraverty, 2016; Raissi et al., 2019; Sirignano and Spiliopoulos, 2018; Yadav et al., 2015). These so-called neural solvers revisit an idea with origins more than 20 years ago (Aarts and Van Der Veer, 2001; Dissanayake and Phan-Thien, 1994; Lagaris et al., 1998, 2000; Lee and Kang, 1990; Meade Jr and Fernandez, 1994) in the new light of the ongoing advances in machine learning (ML) technologies, the availability of deep learning software (Abadi et al., 2016; Al-Rfou et al., 2016; Haghighat and Juanes, 2021; Lu et al., 2021; Paszke et al., 2017), and the capabilities of modern computing hardware (Jouppi et al., 2018; LeCun, 2019).…”
Section: Introduction
confidence: 99%
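The core idea referenced in this excerpt, and in the cited Neural ODEs paper, can be sketched in a few lines: a neural network parameterizes the right-hand side f(t, y) of an ODE, and an ordinary numerical integrator advances the state. This is a minimal illustration only, with random untrained weights, a hand-coded two-layer MLP, and fixed-step forward Euler in place of the adaptive solvers the literature actually uses; all names and sizes here are assumptions:

```python
import numpy as np

# Random placeholder weights for a tiny MLP; in a real neural solver
# these would be trained against data or a PDE/ODE residual loss.
rng = np.random.default_rng(0)
W1, b1 = 0.5 * rng.normal(size=(16, 2)), np.zeros(16)
W2, b2 = 0.5 * rng.normal(size=(1, 16)), np.zeros(1)

def f(t, y):
    """Network-parameterized vector field f(t, y) for dy/dt = f(t, y)."""
    h = np.tanh(W1 @ np.array([t, y]) + b1)  # hidden layer
    return float(W2 @ h + b2)                # scalar output

def euler_solve(y0, t0, t1, n_steps=100):
    """Integrate dy/dt = f(t, y) with fixed-step forward Euler."""
    t, y = t0, y0
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        y += dt * f(t, y)
        t += dt
    return y

y_final = euler_solve(1.0, 0.0, 1.0)
```

Because tanh bounds the hidden activations, the sketch integrates stably for any placeholder weights; swapping Euler for an adaptive Runge-Kutta scheme is what turns this toy into the continuous-depth model of Chen et al. (2018).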