2018
DOI: 10.1109/access.2018.2882777
An Emergent Space for Distributed Data With Hidden Internal Order Through Manifold Learning

Abstract: Manifold-learning techniques are routinely used in mining complex spatiotemporal data to extract useful, parsimonious data representations/parametrizations; these are, in turn, useful in nonlinear model identification tasks. We focus here on the case of time series data that can ultimately be modelled as a spatially distributed system (e.g., a partial differential equation, PDE), but where we do not know the space in which this PDE should be formulated. Hence, even the spatial coordinates for the distributed sy…
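
The abstract describes using manifold learning to extract parsimonious coordinates for time-series data whose spatial ordering is unknown; the citation statements below identify the specific tool as diffusion maps (Kemeth et al., 2018). A minimal sketch of that idea in Python/NumPy follows: each measurement site's time series is treated as one data point, and the leading diffusion-map coordinates act as emergent spatial coordinates. The bandwidth heuristic, the travelling-wave toy data, and the function name diffusion_maps are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.linalg import eigh

def diffusion_maps(X, eps=None, n_coords=2, alpha=1.0):
    """Minimal diffusion-map embedding of the rows of X.

    X : (n_samples, n_features) array; here each row is the full time
        series recorded at one (unordered) measurement site.
    Returns the leading non-trivial diffusion coordinates.
    """
    d2 = squareform(pdist(X, metric="sqeuclidean"))
    if eps is None:
        eps = np.median(d2)                  # common bandwidth heuristic
    W = np.exp(-d2 / eps)

    # Density normalisation (alpha = 1 approximates the Laplace-Beltrami operator).
    q = W.sum(axis=1)
    K = W / np.outer(q, q) ** alpha

    # Symmetric conjugate of the Markov matrix for a stable eigensolve.
    d = K.sum(axis=1)
    A = K / np.sqrt(np.outer(d, d))
    evals, evecs = eigh(A)
    idx = np.argsort(evals)[::-1]            # descending eigenvalues
    evals, evecs = evals[idx], evecs[:, idx]

    # Right eigenvectors of the Markov matrix; column 0 is the trivial one.
    psi = evecs / np.sqrt(d)[:, None]
    return psi[:, 1:n_coords + 1] * evals[1:n_coords + 1]

# Illustrative use: rows of `series` are time series at shuffled sites;
# the returned columns play the role of emergent spatial coordinates.
# Because the hidden coordinate here is periodic, the sites trace out a
# closed curve in the two recovered coordinates.
rng = np.random.default_rng(0)
x = rng.permutation(np.linspace(0, 2 * np.pi, 200, endpoint=False))  # hidden 1-D order
t = np.linspace(0, 10, 50)
series = np.sin(x[:, None] - t[None, :])     # travelling-wave toy data
emergent = diffusion_maps(series, n_coords=2)
```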

Cited by 19 publications (19 citation statements)
References 48 publications
“…It has been shown that with a Chung-Lu network (30, 31) these oscillators are drawn to an attractive limit cycle along which their states can be described by two parameters: their applied current I_app and their degree, κ. These two heterogeneities can also be described by two diffusion map coordinates, φ₁ and φ₂ (Kemeth et al., 2018). Plotting the potential, V, for a single period of the limit cycle produces the stack of surfaces shown in Figure 15B.…”
Section: Discussion and Future Work
confidence: 99%
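
The statement above reports that the two oscillator heterogeneities (applied current I_app and degree κ) can equivalently be described by two diffusion-map coordinates φ₁ and φ₂. A toy analogue of that observation, reusing the diffusion_maps sketch given after the abstract; the synthetic trajectories, parameter ranges, and variable names are illustrative stand-ins, not the Chung-Lu/Hodgkin-Huxley simulation of the citing work.

```python
# Toy analogue: each "oscillator" carries two hidden heterogeneities
# (stand-ins for the applied current I_app and the degree kappa) that
# shape its trajectory; diffusion maps on the trajectories should return
# two coordinates parametrizing the same two-parameter family.
# Assumes diffusion_maps() from the sketch after the abstract is in scope.
import numpy as np

rng = np.random.default_rng(1)
n_osc = 300
t = np.linspace(0.0, 20.0, 200)
i_app = rng.uniform(0.8, 1.2, n_osc)      # hypothetical applied currents
kappa = rng.uniform(0.5, 1.5, n_osc)      # hypothetical degrees

# Synthetic trajectories depending smoothly on both heterogeneities.
traj = np.sin(i_app[:, None] * t) + 0.3 * np.cos(kappa[:, None] * t)

phi = diffusion_maps(traj, n_coords=2)    # analogues of phi_1, phi_2
# Scatter-plotting phi[:, 0] against phi[:, 1], colored by i_app and by
# kappa, visualizes the two-parameter description the citing authors report.
```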
“…This task requires a new notion of distribution distance, which is additionally informed by physics at the microscopic level. Finally, we note that one could even infer the independent variables (the appropriate parameterizations of space and even time) for coarse-grained PDEs in systems where there is no a priori notion of physical space or time for modeling [37][38][39].…”
Section: Discussion
confidence: 99%
“…In contrast, here we seek to learn an optimal metric given the available data. The importance of the metric has long been recognized in data science and machine learning communities [26,28,4,21,14]. In fact, our method draws on the work of Xing et al [28] who proposed a metric learning algorithm for semi-supervised clustering of data in finite dimensions.…”
Section: Related Work
confidence: 99%