2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP) 2019
DOI: 10.1109/mlsp.2019.8918875

Visualizing High Dimensional Dynamical Processes

Abstract: Manifold learning techniques for dynamical systems and time series have shown their utility for a broad spectrum of applications in recent years. While these methods are effective at learning a low-dimensional representation, they are often insufficient for visualizing the global and local structure of the data. In this paper, we present DIG (Dynamical Information Geometry), a visualization method for multivariate time series data that extracts an information geometry from a diffusion framework. Specifically, w…


Cited by 8 publications (9 citation statements)
References 25 publications
“…Many machine learning methods depend on some measure of pairwise similarity (which is usually unsupervised) including dimensionality reduction methods [17], [18], [19], [20], [21], [22], [23], spectral clustering [24], and any method involving the kernel trick such as SVM [25] and kernel PCA [26]. Random forest proximities can be used to extend many of these problems to a supervised setting and have been used for data visualization [27], [28], [29], [30], [31], outlier detection [30], [32], [33], [34], and data imputation [35], [36], [37], [38].…”
Section: Introduction
confidence: 99%
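The excerpt above notes that random forest proximities turn an unsupervised pairwise-similarity measure into a supervised one. A minimal sketch of the classic leaf co-occurrence proximity (Breiman's original formulation, used here as a stand-in for the more refined variants the excerpt cites) with scikit-learn:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Leaf index of every sample in every tree: shape (n_samples, n_trees)
leaves = rf.apply(X)

# Classic proximity: the fraction of trees in which two samples
# land in the same terminal leaf
prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
```

Because the trees are trained on the labels, `prox` is a supervised similarity: samples from the same class that follow similar decision paths end up close, which is what lets proximities drive supervised visualization, outlier detection, and imputation.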
“…Many machine learning methods depend on some measure of pairwise similarity (which is usually unsupervised) including dimensionality reduction methods [17], [18], [19], [20], [21], [22], [23], spectral clustering [24], and any method involving the kernel trick such as SVM [25] and kernel PCA [26]. Random forest proximities can be used to extend many of these problems to a supervised setting and have been used for data visualization [27], [28], [29], [30], [31], outlier detection [30], [32], [33], [34], and data imputation [35], [36], [37], [38].…”
Section: Introductionmentioning
confidence: 99%
“…On the other extreme (γ = +1), the resulting information distance yields an L2 distance between localized diffusion energy potentials given by U_t^x(z) = log(p_t^x(z)), as discussed by Moon et al. [10]. There, as well as in other work [40, 41], it has been shown that this potential distance is amenable to a low-dimensional embedding that captures and visually accentuates emergent global and local structures in the data. Therefore, the PHATE method is based on embedding potential distances directly into two- or three-dimensional coordinates via a stress-minimizing optimization procedure provided by MDS.…”
Section: Multiscale PHATE Algorithm
confidence: 73%
“…The potential distance, D_t, is computed by log-transforming entries of P^t and then computing the pairwise distances between the log-transformed columns. Other works have shown this diffusion potential is capable of mapping out noisy and complex data in biological and other contexts [1, 16, 19]. The potentially complex, nonlinear structural relationships are finally mapped to a 2- or 3-dimensional embedding, G, via metric MDS.…”
Section: Results
confidence: 99%
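The two excerpts above describe the same recipe: power a row-normalized diffusion operator, log-transform it, take pairwise distances to get the potential distance D_t, and embed D_t with metric MDS. A minimal sketch on synthetic data, assuming an illustrative Gaussian kernel and a fixed number of diffusion steps t:

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))

# Affinity matrix from a Gaussian kernel (illustrative choice of kernel/bandwidth)
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
K = np.exp(-d2 / d2.mean())

# Row-normalize into a diffusion operator and take t diffusion steps
P = K / K.sum(axis=1, keepdims=True)
t = 8
Pt = np.linalg.matrix_power(P, t)

# Potential distance D_t: log-transform P^t, then pairwise L2 distances
# (small epsilon guards against log(0))
U = np.log(Pt + 1e-12)
Dt = np.linalg.norm(U[:, None, :] - U[None, :, :], axis=2)

# Embed the precomputed distances into 2D with metric MDS
G = MDS(n_components=2, dissimilarity="precomputed",
        random_state=0).fit_transform(Dt)
```

The log transform is what turns diffusion probabilities into "potentials": it amplifies differences in the small transition probabilities that encode global structure, which raw L2 distances between rows of P^t would wash out.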
“…Output: The RF-PHATE embedding, G
1. Train the random forest (RF) on M.
2. Generate RF-GAP [14] proximities (P) from the RF.
3. Row-normalize P to form the initial diffusion operator P.
4. Apply damping [15] to form P_β.
5. Perform t diffusion steps, with t selected using VNE, to form P_β^t.
6. Calculate the potential distance [1, 16] D_t from P_β^t by applying the log transform of P_β^t and then computing pairwise distances between columns.
7. Form the embedding G by applying MDS to D_t.…”
Section: Methods
confidence: 99%
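The RF-PHATE steps quoted above can be sketched end to end. This is not the authors' implementation: classic leaf co-occurrence proximities stand in for RF-GAP, and the damping and VNE-based selection of t are simplified to a fixed t, so it only illustrates the overall pipeline shape:

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.manifold import MDS

X, y = load_wine(return_X_y=True)

# Steps 1-2: train the RF and derive proximities (leaf co-occurrence here,
# standing in for the RF-GAP proximities used in the paper)
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
leaves = rf.apply(X)
prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)

# Step 3: row-normalize into a diffusion operator
P = prox / prox.sum(axis=1, keepdims=True)

# Steps 4-5: damping and VNE-based step selection simplified to a fixed t
t = 4
Pt = np.linalg.matrix_power(P, t)

# Step 6: potential distance via log transform, then pairwise distances
# between columns of the log-transformed operator
U = np.log(Pt + 1e-12)
cols = U.T
Dt = np.linalg.norm(cols[:, None, :] - cols[None, :, :], axis=2)

# Step 7: embed the potential distances with MDS
G = MDS(n_components=2, dissimilarity="precomputed",
        random_state=0).fit_transform(Dt)
```

Because the proximities come from a forest trained on the labels, the resulting embedding emphasizes structure relevant to the supervised task, which is the point of swapping a generic kernel for random forest proximities.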