2017
DOI: 10.1088/1741-2552/aa61bb
Dimensionality reduction based on distance preservation to local mean for symmetric positive definite matrices and its application in brain–computer interfaces

Abstract: In this paper, we propose a nonlinear dimensionality reduction algorithm for the manifold of Symmetric Positive Definite (SPD) matrices that considers the geometry of SPD matrices and provides a low-dimensional representation of the manifold with high class discrimination. The proposed algorithm tries to preserve the local structure of the data through distance preservation to local mean (DPLM) and also provides an implicit projection matrix. DPLM is linear in the number of training samples and may use the …
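The truncated abstract does not spell out the metric or the form of the projection, so the following is a minimal sketch of the DPLM idea under common assumptions: SPD matrices are compared with the affine-invariant Riemannian metric, each point's local mean is approximated by the closed-form log-Euclidean mean of its nearest neighbors, and the low-dimensional representation comes from a congruence projection X -> W^T X W with a tall, full-column-rank W (which keeps the result SPD). All function names are illustrative; this is not the authors' implementation.

```python
# Hedged sketch of a DPLM-style objective on the SPD manifold.
# Assumptions (not stated in the truncated abstract): affine-invariant
# metric, log-Euclidean local means, congruence projection X -> W^T X W.
import numpy as np
from scipy.linalg import eigh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    sqrt(sum_i log^2(lambda_i)), where lambda_i are the generalized
    eigenvalues of the pencil (B, A)."""
    lam = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

def log_euclidean_mean(mats):
    """Closed-form surrogate for the Riemannian mean: expm(mean(logm(X)))."""
    logs = []
    for X in mats:
        w, V = np.linalg.eigh(X)
        logs.append((V * np.log(w)) @ V.T)   # logm via eigendecomposition
    S = np.mean(logs, axis=0)
    w, V = np.linalg.eigh(S)
    return (V * np.exp(w)) @ V.T             # expm back to the manifold

def dplm_cost(W, X, k=5):
    """Mismatch between each point's distance to its local mean before
    and after the projection X -> W^T X W (W: n x p, full column rank)."""
    n = len(X)
    D = np.array([[airm_distance(X[i], X[j]) for j in range(n)]
                  for i in range(n)])
    cost = 0.0
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]               # k nearest neighbors
        Mi = log_euclidean_mean([X[j] for j in nbrs])  # local mean
        d_orig = airm_distance(X[i], Mi)
        d_proj = airm_distance(W.T @ X[i] @ W, W.T @ Mi @ W)
        cost += (d_orig - d_proj) ** 2
    return cost
```

The paper states that the projection matrix is implicit and that the method is linear in the number of training samples; the explicit pairwise-distance loop above is quadratic and is only meant to make the objective concrete.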

Cited by 32 publications (24 citation statements) | References 34 publications (83 reference statements)
“…In parallel, geometry-aware dimensionality reduction (see section 3.1) inspired by the Riemannian framework is currently being intensely investigated. Works related to BCI data include [126][127][128][129][130][131]. Relevant work, which can be readily borrowed from the computer vision community, includes [132,133].…”
Section: A Review of Studies Applying Riemannian Geometry to EEG
Confidence: 99%
“…Furthermore, the variance is sensitive to outliers and noisy points, as is well known. To remedy this last drawback, [6] proposed estimating M for each C_k as the center of mass of a number of points in its neighborhood. Such a procedure may indeed be resistant to outliers; still, there is no guarantee that the original geometrical structure is preserved, in that distinct neighborhoods of points may be projected to arbitrary positions on the embedding manifold.…”
Section: A Closed-Form Unsupervised Geometry-Aware Dimensionality Reduction
Confidence: 99%
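The center-of-mass estimate mentioned above can be made concrete with a standard fixed-point iteration for the Karcher (Fréchet) mean under the affine-invariant metric. This is a generic recipe, not necessarily the estimator used in [6]; the function names are illustrative.

```python
# Hedged sketch: Karcher (Frechet) mean of a neighborhood of SPD points,
# a robust replacement for the reference point M of each C_k.
import numpy as np

def _spd_pow(X, p):
    """X**p for SPD X via eigendecomposition (guaranteed real output)."""
    w, V = np.linalg.eigh(X)
    return (V * w ** p) @ V.T

def _spd_log(X):
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def _spd_exp(S):
    w, V = np.linalg.eigh(S)
    return (V * np.exp(w)) @ V.T

def karcher_mean(mats, n_iter=20, tol=1e-8):
    """Fixed-point iteration for the Riemannian center of mass:
    M <- Exp_M(mean_i Log_M(X_i)) under the affine-invariant metric."""
    M = sum(mats) / len(mats)        # arithmetic mean as starting point
    for _ in range(n_iter):
        M_sqrt, M_isqrt = _spd_pow(M, 0.5), _spd_pow(M, -0.5)
        # Average of the tangent vectors at the current estimate M.
        T = np.mean([_spd_log(M_isqrt @ X @ M_isqrt) for X in mats], axis=0)
        M = M_sqrt @ _spd_exp(T) @ M_sqrt
        if np.linalg.norm(T) < tol:  # converged: mean tangent vector ~ 0
            break
    return M
```

Averaging over a whole neighborhood damps the influence of any single outlier, which is exactly the robustness the statement attributes to [6]; the caveat remains that nothing in this estimate alone constrains where distinct neighborhoods land in the embedding.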
“…Unsupervised refers to the fact that we do not assume knowledge of anything but the input points; class labels, for instance, which are often available for a training data set in machine learning applications, are not used. Geometry-aware has been used recently in the literature to suggest that the embedding should respect as much as possible the geometrical structure of the data points [3][4][5][6]. The problem is relevant for at least two reasons:…”
Section: Introduction
Confidence: 99%
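To make the term geometry-aware concrete, one generic objective (an illustration, not taken from any of [3][4][5][6]) is a stress function: the embedding respects the geometrical structure when the pairwise geodesic distances of the input SPD points are reproduced by the embedded points.

```python
# Illustrative stress objective for an unsupervised geometry-aware
# embedding: compare geodesic distances on the SPD manifold with the
# distances between the embedded points Y (taken Euclidean here).
import numpy as np
from scipy.linalg import eigh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices."""
    lam = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

def stress(Y, X):
    """Sum over pairs of (d_manifold(X_i, X_j) - ||y_i - y_j||)^2.
    A perfect embedding (stress 0) preserves all pairwise distances."""
    s = 0.0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d_m = airm_distance(X[i], X[j])
            d_e = np.linalg.norm(Y[i] - Y[j])
            s += (d_m - d_e) ** 2
    return s
```

Note that no class labels appear anywhere in this objective, which is what makes such an embedding unsupervised in the sense defined above.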
“…To deal with this problem, more advanced methods, such as Generalized Multi-view Analysis (GMA) [5] and Multi-view Modular Discriminant Analysis (MvMDA) [30], were proposed thereafter to achieve improved classification performance by taking into consideration either intra-view or inter-view discriminant information. Despite promising results on some applications, these methods are only capable of discovering the intrinsic geometric structure of data lying on linear or near-linear manifolds [25], [26], and their performance cannot be guaranteed when the data is non-linearly embedded in a high-dimensional observation space or suffers from heavy outliers [27]–[29].…”
Section: Introduction
Confidence: 99%