2019
DOI: 10.1007/s00332-019-09548-1
Spatiotemporal Pattern Extraction by Spectral Analysis of Vector-Valued Observables

Abstract: We present a data-driven framework for extracting complex spatiotemporal patterns generated by ergodic dynamical systems. Our approach, called Vector-valued Spectral Analysis (VSA), is based on an eigendecomposition of a kernel integral operator acting on a Hilbert space of vector-valued observables of the system, taking values in a space of functions (scalar fields) on a spatial domain. This operator is constructed by combining aspects of the theory of operator-valued kernels for machine learning with delay-c…

Cited by 27 publications (32 citation statements)
References 72 publications (219 reference statements)
“…, }, (i) the eigenvalues λ i,n converge to λ i ; (ii) the RKHS functions ψ i,n converge, up to multiplication by a constant phase factor, to ψ i in C(U) norm; and (iii) each of the expansion coefficients α i,n (τ) converges to α i (τ). The first two of these claims are a consequence of the following lemma, which is based on [46,Theorem 15], [55,Corollary 2], and [56,Theorem 7].…”
Section: KAF Sample Error
confidence: 99%
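The excerpt above concerns convergence of empirical eigenvalues λ i,n of a kernel integral operator to the true eigenvalues λ i as the sample size grows. A minimal numerical sketch of this behavior is shown below; it is not the paper's implementation, and the Gaussian kernel, bandwidth, and sample sizes are illustrative assumptions:

```python
import numpy as np

def kernel_eigs(x, bandwidth=0.5, k=3):
    """Top-k eigenvalues of the empirical kernel integral operator.

    The n x n Gaussian kernel matrix divided by n discretizes the
    integral operator; its eigenvalues lambda_{i,n} approximate the
    operator eigenvalues lambda_i as n grows.
    """
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / bandwidth**2)
    evals = np.linalg.eigvalsh(K / len(x))
    return np.sort(evals)[::-1][:k]

rng = np.random.default_rng(0)
small = kernel_eigs(rng.uniform(size=200))    # n = 200 samples
large = kernel_eigs(rng.uniform(size=2000))   # n = 2000 samples
# the leading eigenvalues computed from the two sample sizes agree closely
```

With independent uniform samples, the leading empirical eigenvalues stabilize quickly in n, consistent with the spectral convergence results the excerpt cites.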
“…That said, there has been some progress in understanding the RC-ESNs, in particular by modeling the network itself as a dynamical system [65,53]. Such efforts, for example those aimed at understanding the echo states that are learned in the RC-ESN's reservoir, might benefit from recent advances in dynamical systems theory [66,67,68,69,70,71].…”
Section: Discussion
confidence: 99%
“…The eigenvectors φ j then provide representations of the basis elements φ j,N (see Section V C). Elsewhere, we have demonstrated the feasibility of this implementation for computing eigenfunctions from high-dimensional datasets of moderate sample number, (d, N ) = O(10 6 , 10 4 ) [51], or datasets of moderate dimension and high sample number, (d, N ) = O(10 2 , 10 6 ) [26]. In the latter case, it should be possible to speed up the kernel matrix calculation using tree-based [52] or randomized [53] approximate nearest-neighbor algorithms, though we have not explored such options in the present work.…”
Section: Data-Driven Basis
confidence: 98%
“…of L 2 (µ), consisting of eigenfunctions of G µ . See [25,26] for proofs of these results, which make use of spectral convergence results for kernel integral operators established in [13]. Following [25], we construct the kernels p N starting from an unnormalized kernel k N : M × M → R, and applying to that kernel a normalization procedure to render it Markovian.…”
Section: Kernels and Their Associated Eigenfunction Bases
confidence: 99%
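The excerpt above refers to normalizing an unnormalized kernel to render it Markovian. A minimal sketch of the simplest such normalization, making each row of the kernel matrix sum to one so it becomes a Markov transition matrix, is shown below (this is a generic diffusion-maps-style step, not necessarily the specific procedure of the cited work; kernel and parameters are illustrative):

```python
import numpy as np

def markov_normalize(K):
    """Row-normalize a nonnegative kernel matrix so each row sums to 1,
    turning it into the transition matrix of a Markov chain."""
    return K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(2)
x = rng.uniform(size=100)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.1)  # unnormalized kernel
P = markov_normalize(K)
# rows of P sum to one, and its largest eigenvalue is 1
```

Since the unnormalized kernel is symmetric and strictly positive, the normalized matrix has real spectrum with leading eigenvalue exactly 1, as expected for a Markov operator.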