2020
DOI: 10.3934/jcd.2020003
Approximation of Lyapunov functions from noisy data

Abstract: Methods have previously been developed for the approximation of Lyapunov functions using radial basis functions. However, these methods assume that the evolution equations are known. We consider the problem of approximating a given Lyapunov function using radial basis functions where the evolution equations are not known, but we instead have sampled data which is contaminated with noise. We propose an algorithm in which we first approximate the underlying vector field, and use this approximation to then approxi…
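The first step the abstract outlines — fitting a vector field from noisy sampled data with radial basis functions — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses Gaussian kernel ridge regression as a stand-in for the paper's RBF construction, and the function names, parameters, and the test vector field are all assumptions for the example.

```python
# Sketch: approximate a vector field from noisy samples with Gaussian RBFs.
# Kernel ridge regression is used as a stand-in for the paper's method;
# names, parameters, and the 1-D test field f(x) = -x are illustrative only.
import numpy as np

def gaussian_kernel(X, Y, eps=1.0):
    # Pairwise Gaussian RBF values exp(-eps^2 * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps**2 * d2)

def fit_vector_field(X, F_noisy, eps=1.0, reg=1e-3):
    # Solve (K + reg*I) c = F for the RBF coefficients; the ridge term
    # guards against the noise contaminating the sampled field values.
    K = gaussian_kernel(X, X, eps)
    return np.linalg.solve(K + reg * np.eye(len(X)), F_noisy)

def eval_vector_field(Xq, X, coeffs, eps=1.0):
    return gaussian_kernel(Xq, X, eps) @ coeffs

# Example: recover f(x) = -x from noisy samples on [-2, 2].
rng = np.random.default_rng(0)
X = np.linspace(-2.0, 2.0, 40)[:, None]
F = -X + 0.05 * rng.standard_normal(X.shape)   # noisy field observations
coeffs = fit_vector_field(X, F)
approx = eval_vector_field(np.array([[1.0]]), X, coeffs)
```

The regularization parameter `reg` plays the role that noise-robust smoothing plays in the paper's setting: with exact interpolation (`reg=0`) the fit would chase the noise, while a small ridge term trades a little bias for stability.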

Cited by 27 publications (19 citation statements).
References 24 publications (42 reference statements).
“…The training and prediction results are shown in the following table with R_1 the RMSE corresponding to 50,000 points with initial conditions (0.5, 1.5, 2.5). (i) Convergence results that characterize the error estimates of the difference between a dynamical system and its approximation from data using kernel methods can be found in [7,18].…”
Section: Example 4 (The Lorenz System)
confidence: 99%
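The experimental setup this citation describes — trajectories of the Lorenz system started from (0.5, 1.5, 2.5), scored by RMSE — can be reproduced in outline. This is a hedged sketch of such a setup, not the cited paper's code: the integrator, step size, trajectory length, and the baseline persistence predictor are all assumptions.

```python
# Sketch of a Lorenz-system RMSE experiment like the one the citation
# describes. Step size, horizon, and the baseline predictor are assumed,
# not taken from the cited paper.
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z), x * y - beta * z])

def rk4_step(f, state, dt):
    # Classical fourth-order Runge-Kutta step.
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(x0, n_steps, dt=0.01):
    out = np.empty((n_steps, 3))
    out[0] = x0
    for i in range(1, n_steps):
        out[i] = rk4_step(lorenz, out[i - 1], dt)
    return out

def rmse(pred, true):
    return float(np.sqrt(np.mean((pred - true) ** 2)))

# Initial condition (0.5, 1.5, 2.5) as in the cited experiment.
traj = trajectory(np.array([0.5, 1.5, 2.5]), n_steps=5000)
# A trivial persistence predictor (predict x_{t+1} = x_t) as a baseline
# against which a learned emulator's RMSE could be compared.
baseline = rmse(traj[:-1], traj[1:])
```

Any learned emulator of the flow should beat this persistence baseline for its RMSE to be meaningful.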
“…On the other hand, RNN-LSTM were observed to be accurate for estimating Lyapunov exponents but not as good as reservoir computing for predictions (see [14] for a survey). Although Reproducing Kernel Hilbert Spaces (RKHS) [16] have provided strong mathematical foundations for analyzing dynamical systems [5,6,8,20,24,7,18,4,22,23,2], the accuracy of these emulators depends on the kernel, and the problem of selecting a good kernel has received less attention.…”
Section: Introduction
confidence: 99%
“…Learning safety certificates. A wide body of work addresses learning Lyapunov [9,13,7,28,24,6,27] and barrier [37,29,12] functions, as well as contraction metrics [33,23,32] and contracting vector fields [31,14] from data. While the generality and strength of guarantees provided vary (see the literature review of Boffi et al. [4] for a detailed exposition), all of the aforementioned works consider nominally specified systems without uncertainty, whereas our approach explicitly considers perturbations that can capture model uncertainty and process noise.…”
Section: Related Work
confidence: 99%
“…The first two issues will cause trouble for existing Lyapunov-based ROA analysis methods using semidefinite programming (Yin et al., 2021; Hu et al., 2020; Jin and Lavaei, 2020; Aydinoglu et al., 2021) or mixed-integer programs (Chen et al., 2020, 2021; Dai et al., 2021). Due to the last issue, the methods of Lyapunov neural networks (Richards et al., 2018; Chang et al., 2019) or other stability certificate learning methods (Kenanian et al., 2019; Giesl et al., 2020; Ravanbakhsh and Sankaranarayanan, 2019) may also not be applicable, since these methods typically require the control action to depend on the current state. Our goal is to develop a ROA analysis method which can address the above three issues simultaneously.…”
Section: Introduction
confidence: 99%