“…Specifically, when the underlying process consists of independent coordinates, the leading d eigenvectors (excluding the trivial one), which form a local canonical/intrinsic coordinate system for the manifold [33], recover d proxies for the underlying process coordinates up to a monotonic scaling [29]. In other words, they are empirical solutions to the inverse problem described by the differential equation in (12). In addition, the eigenvectors are independent when the manifold is flat [11].…”
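A minimal sketch of this recovery, assuming a plain Gaussian kernel and a random-walk normalization (the function name `diffusion_coords`, the toy signal, and the kernel scale `eps` are illustrative, not from the paper):

```python
import numpy as np

def diffusion_coords(X, eps, d=1):
    """Leading non-trivial eigenvectors of a row-stochastic kernel
    (a discrete Laplace-type operator) on a Gaussian graph over X."""
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-D2 / eps)                                 # Gaussian affinities
    P = W / W.sum(axis=1, keepdims=True)                  # random-walk normalization
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    # Skip the trivial constant eigenvector (eigenvalue 1).
    return vecs.real[:, order[1:1 + d]]

# Toy example: measurements driven by a single underlying coordinate t.
t = np.linspace(0, 1, 200)
X = np.stack([np.cos(2 * t), np.sin(2 * t), t ** 2], axis=1)  # nonlinear measurement
psi = diffusion_coords(X, eps=0.1)
# psi[:, 0] is a proxy for t up to a monotonic scaling (and sign).
```

On this one-dimensional toy manifold, the first non-trivial eigenvector traces the underlying coordinate monotonically, illustrating the claim in the quoted passage.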
Section: Laplace Operator
“…It implies that two measurements are similar if they "see" the reference measurements in the same way. Furthermore, it is shown in [29] and [12] that the elements of the extended kernel are proportional to a Gaussian defined similarly to (22) with the corresponding Mahalanobis distances between pairs of new measurements.…”
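A sketch of such an extended kernel, assuming local covariance estimates are already available at every point (the name `extended_kernel` and the symmetrized form of the Mahalanobis distance are illustrative assumptions, not necessarily the exact expression in (22)):

```python
import numpy as np

def extended_kernel(X_new, C_new, X_ref, C_ref, eps):
    """Gaussian affinities between new measurements (rows of X_new) and
    reference measurements (rows of X_ref), using a symmetrized
    Mahalanobis distance built from local covariance estimates."""
    K = np.empty((len(X_new), len(X_ref)))
    for i, (x, Cx) in enumerate(zip(X_new, C_new)):
        Cx_inv = np.linalg.inv(Cx)
        for j, (y, Cy) in enumerate(zip(X_ref, C_ref)):
            diff = x - y
            M = 0.5 * (Cx_inv + np.linalg.inv(Cy))  # symmetrized inverse covariance
            K[i, j] = np.exp(-diff @ M @ diff / eps)
    return K
```

With identity covariances the expression collapses to an ordinary Euclidean Gaussian kernel, which is a convenient sanity check.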
Section: Sequential Processing
“…Specifically, the eigenvectors provide an embedding (or a parametrization) of the underlying processes on the intrinsic manifold. We remark that the graph is constructed using a reference set of measurements [12,13]. This allows a training signal to be parametrized in advance and, in turn, the parametrization to be extended to newly acquired signal samples in a sequential manner.…”
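One standard way to realize such a sequential extension is a Nyström-style formula: embed the reference set once, then map each new sample through its normalized affinities to the references. A minimal sketch under these assumptions (function names are illustrative):

```python
import numpy as np

def fit_reference_embedding(X_ref, eps, d=2):
    """Embed a reference set via the leading non-trivial eigenvectors
    of a row-stochastic Gaussian kernel."""
    D2 = ((X_ref[:, None] - X_ref[None, :]) ** 2).sum(-1)
    W = np.exp(-D2 / eps)
    P = W / W.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)[1:1 + d]   # skip trivial eigenvector
    return vals.real[order], vecs.real[:, order]

def extend_embedding(x_new, X_ref, vals, vecs, eps):
    """Nystrom-style extension: weight the reference eigenvectors by the
    new sample's normalized affinities, then rescale by 1/eigenvalue."""
    w = np.exp(-((x_new - X_ref) ** 2).sum(-1) / eps)
    p = w / w.sum()
    return (p @ vecs) / vals

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(50, 3))
vals, vecs = fit_reference_embedding(X_ref, eps=1.0)
y_new = extend_embedding(X_ref[7] + 0.01, X_ref, vals, vecs, eps=1.0)
```

A useful property of this scheme is consistency: applied to a reference point itself, the extension reproduces that point's embedding exactly.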
In a broad range of natural and real-world dynamical systems, measured signals are controlled by underlying processes or drivers. As a result, these signals exhibit highly redundant representations, while their temporal evolution can often be compactly described by dynamical processes on a low-dimensional manifold. In this paper, we propose a graph-based method for revealing the low-dimensional manifold and inferring the underlying processes. This method provides intrinsic models for measured signals, which are noise resilient and invariant under different random measurements and instrumental modalities. Such intrinsic models may enable mathematical calibration of complex measurements and support the construction of an empirical geometry driven by the observations, which is especially suitable for applications without a priori knowledge of models and solutions. We exploit the temporal dynamics and natural small perturbations of the signals to explore the local tangent spaces of the low-dimensional manifold of empirical probability densities. This information is used to define an intrinsic Riemannian metric, which in turn gives rise to the construction of a graph that represents the desired low-dimensional manifold. Such a construction is equivalent to an inverse problem, which is formulated as a nonlinear differential equation and solved empirically through the eigenvectors of an appropriate Laplace operator. We evaluate our method on two nonlinear filtering applications: a nonlinear, non-Gaussian tracking problem and a non-stationary hidden Markov chain scheme. The experimental results demonstrate the power of our theory by extracting the underlying processes, which were measured under different nonlinear instrumental conditions, in an entirely data-driven, nonparametric way.
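The abstract's idea of probing local tangent spaces through small temporal perturbations can be sketched as follows: estimate a local covariance from a short time window around each sample, then use the (symmetrized) inverse covariances as a Mahalanobis-type intrinsic metric. This is a simplified illustration under stated assumptions (window-based covariance estimation, a small regularizer, and the helper names are choices made here, not the paper's exact estimator):

```python
import numpy as np

def local_covariances(signal, half_win=5):
    """Estimate a local covariance at each time sample from a short
    temporal window, treating nearby samples as small perturbations
    that probe the local tangent space."""
    T, dim = signal.shape
    covs = np.empty((T, dim, dim))
    for t in range(T):
        lo, hi = max(0, t - half_win), min(T, t + half_win + 1)
        # Small ridge keeps the estimate invertible.
        covs[t] = np.cov(signal[lo:hi].T) + 1e-6 * np.eye(dim)
    return covs

def intrinsic_dist2(x_i, x_j, C_i, C_j):
    """Symmetrized Mahalanobis (intrinsic) squared distance."""
    diff = x_i - x_j
    M = 0.5 * (np.linalg.inv(C_i) + np.linalg.inv(C_j))
    return float(diff @ M @ diff)
```

By construction the distance is symmetric in its arguments, and with identity covariances it reduces to the squared Euclidean distance.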
“…Technically, this kernel form has been considered in several related works, such as [30,14,21,19,15,27] and references therein. These works consider the relations between the analyzed dataset and a reference set, which is typically much smaller than the dataset.…”
“…Another approach, which is presented in [4,15], is to analyze the data by considering their relations to a given reference set, which can either be part of the input data, or designed for specific applications. Therefore, instead of representing pairwise similarities within the data, the kernel in these cases takes an asymmetric form consisting of relations between data points and reference points.…”
In this short letter we present the construction of a bi-stochastic kernel p for an arbitrary data set X that is derived from an asymmetric affinity function α. The affinity function α measures the similarity between points in X and some reference set Y. Unlike other methods that construct bi-stochastic kernels via a convergent iterative process or by solving an optimization problem, the construction presented here is quite simple. Furthermore, it can be viewed through the lens of out-of-sample extensions, making it useful for massive data sets.
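A minimal sketch of one closed-form (non-iterative) normalization in this spirit: row-normalize the asymmetric affinity matrix, then compose it with its column-mass-corrected transpose. The resulting matrix is symmetric with rows summing to one, hence bi-stochastic. This is an illustration, not necessarily the letter's exact construction:

```python
import numpy as np

def bistochastic_from_affinity(alpha):
    """Build a symmetric, bi-stochastic kernel on X from an asymmetric
    affinity matrix alpha (rows: points of X, columns: reference set Y).
    Assumes strictly positive affinities so every normalizer is nonzero."""
    A = alpha / alpha.sum(axis=1, keepdims=True)   # row-stochastic map X -> Y
    omega = A.sum(axis=0)                           # column mass over X
    # P @ 1 = A @ (omega / omega) ... = A @ 1 = 1, and P is symmetric.
    return A @ np.diag(1.0 / omega) @ A.T

rng = np.random.default_rng(0)
alpha = rng.random((6, 3)) + 0.1   # strictly positive asymmetric affinities
P = bistochastic_from_affinity(alpha)
```

Because P is symmetric and each row sums to one, both its rows and columns are stochastic, and the construction requires no iteration.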