2021
DOI: 10.48550/arxiv.2109.13359
Preprint

Lyapunov-Net: A Deep Neural Network Architecture for Lyapunov Function Approximation

Abstract: We develop a versatile deep neural network architecture, called Lyapunov-Net, to approximate Lyapunov functions of dynamical systems in high dimensions. Lyapunov-Net guarantees positive definiteness, and thus it can be easily trained to satisfy the negative orbital derivative condition, which only renders a single term in the empirical risk function in practice. This significantly reduces the number of hyper-parameters compared to existing methods. We also provide theoretical justifications on the approximation…
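The core idea in the abstract — a network that is positive definite by construction, so that training only needs a single negative-orbital-derivative penalty — can be sketched in a few lines. The sketch below is a hedged illustration, not the authors' released code: the specific form V_θ(x) = ‖φ_θ(x) − φ_θ(x*)‖² + δ‖x − x*‖², the two-layer tanh network, the margin term, and the names `LyapunovNet` and `orbital_derivative_loss` are all assumptions made for this example.

```python
# Minimal sketch of a Lyapunov-Net-style construction (assumptions noted above).
import torch
import torch.nn as nn

class LyapunovNet(nn.Module):
    """Candidate Lyapunov function that is positive definite by construction."""
    def __init__(self, dim, width=64, delta=1e-2):
        super().__init__()
        self.phi = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )
        self.delta = delta
        # Assumed equilibrium at the origin for this sketch.
        self.register_buffer("x_star", torch.zeros(dim))

    def forward(self, x):
        # V(x) = |phi(x) - phi(x*)|^2 + delta * ||x - x*||^2
        # gives V(x*) = 0 and V(x) > 0 for x != x*, for any weights of phi.
        r = self.phi(x) - self.phi(self.x_star.unsqueeze(0))
        return (r ** 2).sum(dim=-1) + self.delta * ((x - self.x_star) ** 2).sum(dim=-1)


def orbital_derivative_loss(V, f, x, margin=0.0):
    """Single-term empirical risk: hinge penalty on dV/dt = <grad V(x), f(x)>.

    `f` maps a batch of states (N, dim) to the vector field values (N, dim).
    """
    x = x.clone().requires_grad_(True)
    v = V(x)
    grad_v = torch.autograd.grad(v.sum(), x, create_graph=True)[0]
    dVdt = (grad_v * f(x)).sum(dim=-1)
    return torch.relu(dVdt + margin).mean()
```

In use, one would sample training points from the region of interest (e.g. a box around the equilibrium), evaluate `orbital_derivative_loss` with the known vector field, and minimize it with a stochastic optimizer such as Adam; because positive definiteness holds by construction, no additional penalty terms are needed in the risk.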

Cited by 4 publications (6 citation statements) | References 28 publications

Citation statements (ordered by relevance):
“…By the universal approximation theorem, there exists a neural network φ approximating f such that $\|f(x,\kappa(x)) - \phi(x,\kappa(x))\|_\infty < \beta/M$ on $D \setminus \{\|x\| < \varepsilon\}$. As in (10), the following holds…”
Section: Discussion
confidence: 96%
“…As both the right-hand side of a differential equation and the Lyapunov function can be regarded as nonlinear (or linear) functions mapping from the state space to some vector space, a natural question is: can we use neural networks to approximate the dynamics and find a Lyapunov function to attain a larger estimate of the ROA [region of attraction], compared with the widely used linear-quadratic regulator (LQR) and sum of squares (SOS) methods [8,9]? Various applications of neural networks for system identification and Lyapunov stability have been proposed recently [5,10]. For instance, Wang et al [11] use a recurrent neural network to generate the state-space representation of an unknown dynamical system.…”
Section: Introduction
confidence: 99%
“…[comparison-table excerpt] Column headings: Methodology, Certificate, Generic, Affine, Hybrid, Markovian, Model-Free, RL, Supervised, L&NP, LF/CLF, BF/CBF; the row for reference [29] carries three ✓ marks.…”
Section: Model Type
confidence: 99%
“…While both papers [17], [18] empirically verify their findings, neither provides hardware experimental validation of its methods. Neural certificates, which use neural networks to construct safety barrier certificates, have been demonstrated in [19]-[22]. These neural certificates offer a data-driven approach to learning-based controllers and provide formal proofs of correctness.…”
Section: Introduction
confidence: 99%