2019
DOI: 10.1016/j.jcp.2019.04.015
Manifold learning for parameter reduction

Abstract: Large scale dynamical systems (e.g. many nonlinear coupled differential equations) can often be summarized in terms of only a few state variables (a few equations), a trait that reduces complexity and facilitates exploration of behavioral aspects of otherwise intractable models. High model dimensionality and complexity makes symbolic, pen-and-paper model reduction tedious and impractical, a difficulty addressed by recently developed frameworks that computerize reduction. Symbolic work has the benefit, however,…

Cited by 37 publications (40 citation statements)
References 18 publications (2 reference statements)
“…This uncertainty is even more significant with respect to the current situation regarding the asymptomatics which, while concretely studied in some cases [26], presents tremendous variability [43] between studies. As indicated above, the issue at hand is crucial from the point of view of modeling, both regarding issues of dimension reduction [46], as well as those of identifiability of the models [44].…”
Section: Methods and Results
confidence: 99%
“…Actually, we have shown that it is impossible to determine uniquely all 9 model parameters from a reliable set of given data. This is indeed an example of dimension reduction: this represents the crucial aspect of whether a given model outcome depends not on individual parameters alone (except for one of the model parameters) but rather on expressions formed by suitable (irreducible) combinations of parameters; for a recent, data-driven example, see [46]. In addition to the existence of the above prohibitive result, many of the needed data are unavailable.…”
Section: A Computational Algorithm Based On a Rigorous Mathematical R…
confidence: 99%
“…A broad framework leverages unsupervised learning approaches to learn low-complexity representations of physical process observations. In many cases where the underlying process features a small number of degrees of freedom, it is shown that nonlinear manifold learning algorithms are able to discern these degrees of freedom as the component dimensions of low-dimensional nonlinear manifold embeddings, which preserve the underlying geometry of the original data space [Yair et al., 2017, Dsilva et al., 2018, Holiday et al., 2019]. It can be seen that the embedding coordinates show relation to known physical quantities.…”
Section: Interpretation Tools For Scientific Outcomes
confidence: 99%
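The idea quoted above — that a nonlinear manifold learning algorithm can recover a few hidden degrees of freedom as embedding coordinates — can be sketched with a minimal diffusion-map computation in plain NumPy. This is a simplified illustration, not the implementation used in the cited works; the spiral data set and the bandwidth value `epsilon` are assumptions chosen for the example:

```python
import numpy as np

def diffusion_map(X, epsilon, n_coords=1):
    """Minimal diffusion-map sketch: Gaussian kernel, Markov
    normalization, leading nontrivial eigenvectors as coordinates."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    K = np.exp(-d2 / epsilon)                            # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                 # row-stochastic matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # order[0] is the trivial constant eigenvector (eigenvalue 1); skip it
    return vecs.real[:, order[1:1 + n_coords]]

# Data with a single intrinsic degree of freedom: a spiral curve in 3-D
t = np.linspace(0, 3 * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t), 0.3 * t])

phi1 = diffusion_map(X, epsilon=0.5)[:, 0]

# The leading embedding coordinate parameterizes the curve, so it is
# strongly correlated with the hidden parameter t (up to sign)
corr = abs(np.corrcoef(phi1, t)[0, 1])
```

The kernel bandwidth `epsilon` governs which points count as neighbors; it must be small relative to the data's extent but large enough to keep the kernel graph connected, and in practice it is often tuned by inspecting the kernel sum across a range of scales.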
“…Finding these reduced sets of parameters often requires insight, along with trial and error. Here, we present a methodology for discovering such effective parameters in a data-driven way via a modification of diffusion maps, our manifold learning technique of choice in this paper (Holiday et al., 2019). A strong motivation for this work comes from the determination of explicit dimensionless parameters from the (possibly long) list of dimensional parameters of a physical model.…”
Section: Identifying "Effective" Parameters
confidence: 99%
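The data-driven discovery of effective parameters described in this statement can be illustrated with a hedged NumPy sketch: a hypothetical toy model whose output depends on two nominal parameters only through their product, so that an output-informed diffusion map finds a single effective coordinate. The toy model, the median-distance bandwidth, and all variable names are assumptions for the example, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: two nominal parameters, but the output
# depends only on their product -- one "effective" parameter
p = rng.uniform(0.5, 2.0, size=(300, 2))
y = p[:, 0] * p[:, 1]

# Output-informed kernel: parameter settings are close when outputs agree
d2 = (y[:, None] - y[None, :]) ** 2
K = np.exp(-d2 / np.median(d2))
P = K / K.sum(axis=1, keepdims=True)   # Markov normalization

vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
phi1 = vecs.real[:, order[1]]          # leading nontrivial coordinate

# phi1 tracks the combination p1*p2 much more closely than either
# nominal parameter alone
corr_eff = abs(np.corrcoef(phi1, y)[0, 1])
corr_p1 = abs(np.corrcoef(phi1, p[:, 0])[0, 1])
```

Because the kernel is built from model outputs rather than from raw parameter values, one diffusion coordinate suffices even though two parameters were sampled, mirroring the dimensionless-group idea mentioned in the quote.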