2021
DOI: 10.48550/arxiv.2105.01843
Preprint
Promoting global stability in data-driven models of quadratic nonlinear dynamics

Alan A. Kaptanoglu,
Jared L. Callaham,
Christopher J. Hansen
et al.

Abstract: Modeling realistic fluid and plasma flows is computationally intensive, motivating the use of reduced-order models for a variety of scientific and engineering tasks. However, it is challenging to characterize, much less guarantee, the global stability (i.e., long-time boundedness) of these models. The seminal work of Schlegel and Noack [1] provided a theorem outlining necessary and sufficient conditions to ensure global stability in systems with energy-preserving, quadratic nonlinearities, with the goal of eval…
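The energy-preserving condition on the quadratic nonlinearity referenced in the abstract can be checked numerically. The sketch below is illustrative (the function name and tensor convention are not from the paper): writing the quadratic terms as a tensor Q[i, j, k], the coefficient of x_j x_k in dx_i/dt, the nonlinearity contributes nothing to the energy growth x · Q(x, x) for all x exactly when the full symmetrization of Q over its three indices vanishes.

```python
import numpy as np

# Illustrative check (names are not from the paper): the cubic form
# sum_ijk Q[i,j,k] x_i x_j x_k depends only on the fully symmetric part
# of Q, so it vanishes identically iff that symmetrization is zero.
def is_energy_preserving(Q, tol=1e-12):
    perms = [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]
    sym = sum(np.transpose(Q, p) for p in perms) / 6.0
    return bool(np.max(np.abs(sym)) < tol)

# Lorenz system: dx = s(y - x), dy = x(r - z) - y, dz = xy - bz.
# Its only quadratic terms are -xz (in dy) and +xy (in dz).
Q = np.zeros((3, 3, 3))
Q[1, 0, 2] = -1.0   # -xz term in dy
Q[2, 0, 1] = +1.0   # +xy term in dz
print(is_energy_preserving(Q))  # True: the Lorenz nonlinearity is energy-preserving
```

For the Lorenz coefficients above, the contributions y(-xz) and z(+xy) cancel exactly, which is why the check returns True.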

Cited by 3 publications (4 citation statements)
References 90 publications
“…The sparse identification of nonlinear dynamics algorithm [18] learns dynamical systems models with as few terms from a library of candidate terms as are needed to describe the training data. There are several formulations involving different loss terms and optimization algorithms that promote additional physical notions, such as stability [99] and energy conservation [100]. Stability promoting loss functions based on notions of Lyapunov stability have also been incorporated into autoencoders, with impressive results on fluid systems [101].…”
Section: The Loss Function
Citation type: mentioning; confidence: 99%
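The citing statement above describes the core SINDy idea: select as few terms from a candidate library as are needed to fit the data. A minimal sketch of the sequentially thresholded least-squares step commonly used for this (function and variable names here are illustrative, not the PySINDy API) is:

```python
import numpy as np

# Sketch of sequentially thresholded least squares: repeatedly solve
# least squares over a library matrix Theta and zero out coefficients
# below a threshold, yielding a sparse model dX/dt ~ Theta(X) @ Xi.
def stlsq(Theta, dXdt, threshold=0.1, n_iter=10):
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for j in range(dXdt.shape[1]):      # refit each state dimension
            big = ~small[:, j]
            if big.any():
                Xi[big, j] = np.linalg.lstsq(Theta[:, big], dXdt[:, j],
                                             rcond=None)[0]
    return Xi

# Toy usage: recover dx/dt = -2x from the library [x, x^2].
x = np.linspace(1.0, 2.0, 50)
Theta = np.column_stack([x, x**2])
dXdt = (-2.0 * x).reshape(-1, 1)
Xi = stlsq(Theta, dXdt)
print(Xi.ravel())  # coefficient of x is close to -2; x^2 term is pruned to 0
```

The thresholding is what produces sparsity: the spurious x^2 coefficient falls below the threshold after the first fit and is removed before refitting.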
“…One approach is to explicitly add constraints to the optimization, for example that certain coefficients must be non-negative, or that other coefficients must satisfy a specified algebraic relationship with each other. Depending on the given machine learning architecture, it may be possible to enforce energy conservation [100] or stability constraints [99] in this way. Another approach involves employing custom optimization algorithms required to minimize the physically motivated loss functions above, which are often non-convex.…”
Section: The Optimization Algorithm
Citation type: mentioning; confidence: 99%
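One concrete instance of the constrained-optimization approach mentioned above is requiring certain coefficients to be non-negative. As a hedged illustration (not the formulation used in the paper), SciPy's non-negative least-squares solver handles exactly this constraint class:

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative example of a coefficient constraint from the passage:
# scipy.optimize.nnls solves  min ||A x - b||_2  subject to  x >= 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.5, 0.0, 2.0])   # non-negative ground truth
b = A @ x_true
x_hat, residual = nnls(A, b)
print(np.round(x_hat, 6))  # recovers a non-negative solution close to x_true
```

Equality constraints (e.g., fixed algebraic relationships between coefficients) instead require a linearly constrained or projected solver, which is where the custom, often non-convex optimization algorithms mentioned in the statement come in.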
“…In this work, the simple hard-thresholded least-squares algorithm proposed in [47] was, however, found to be sufficient. Note finally that, since its introduction, numerous extensions of SINDy have been proposed; see for instance [56, 59-68] and references therein.…”
Section: SINDy - System Identification
Citation type: mentioning; confidence: 99%
“…Indeed, the results of integrating the model derived by SINDy outperformed those from standard GROM. A recent innovation, trapping SINDy [359], identifies dynamics with a trapping region, i.e., it guarantees boundedness of the solution [114]. This innovation is particularly important for higher-dimensional dynamics, where sparse identification is otherwise prone to producing unbounded solutions.…”
Section: G. System Identification Approaches
Citation type: mentioning; confidence: 99%