2022
DOI: 10.1109/tnnls.2021.3083763
Unsupervised Feature Selection via Orthogonal Basis Clustering and Local Structure Preserving

Abstract: Due to the "curse of dimensionality," discarding redundant features and selecting informative features in high-dimensional data has become a critical problem, and much research has been dedicated to solving it. Unsupervised feature selection, which requires no prior category information, has gained a prominent place among feature selection techniques for pre-processing high-dimensional data, and it has been applied to many neural networks and learning sy…


Cited by 24 publications (8 citation statements)
References 41 publications
“…The calculation of a similarity matrix is realized by first constructing a graph. Spectral-based methods can be roughly divided into two types [23]: one uses a predefined graph, while the other learns an adaptive graph during optimization.…”
Section: Spectral-based Embedded Unsupervised Feature Selection Methods
Mentioning confidence: 99%
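The "predefined graph" route mentioned in the statement above is commonly realized as a k-nearest-neighbour affinity matrix with heat-kernel weights. The sketch below illustrates that construction; the function name and the specific kernel/bandwidth choice are illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def knn_similarity_graph(X, k=5, sigma=1.0):
    """Build a predefined k-NN similarity matrix with heat-kernel weights.

    X: (n_samples, n_features) data matrix.
    Returns a symmetric (n, n) affinity matrix W with zero diagonal.
    This is one common 'predefined graph' choice; adaptive-graph methods
    would instead update W during optimization.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances via the expansion ||a-b||^2
    sq = np.sum(X ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(D, 0.0, out=D)  # guard against tiny negative round-off

    W = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbours of sample i, skipping i itself (distance 0)
        idx = np.argsort(D[i])[1:k + 1]
        W[i, idx] = np.exp(-D[i, idx] / (2.0 * sigma ** 2))
    # Symmetrise so W is a valid undirected-graph affinity matrix
    return np.maximum(W, W.T)
```

The symmetrisation step (max of W and its transpose) keeps an edge whenever either endpoint counts the other among its k nearest neighbours, a common convention for k-NN graphs.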
See 1 more Smart Citation
“…The calculation of a similarity matrix is realized by first constructing a graph. And spectral-based methods can be roughly divided into two types [23]: one to use a predefined graph, while the other learn an adaptive graph during the optimization.…”
Section: Spectral-based Embedded Unsupervised Feature Selection Methodsmentioning
confidence: 99%
“…This inevitably leads to sensitivity to noise features and outlier data. Secondly, many EUFS methods construct a non-convex optimization problem with a series of subproblems to solve in each iteration [22], [23], [31], [32] (arXiv:2309.06202v1 [cs.CV] 12 Sep 2023). Considering that their computational complexity is usually cubic in the number of samples (or the number of features), the running time of solving the whole problem can sometimes be unacceptable.…”
Section: Introduction
Mentioning confidence: 99%
“…To improve the orthogonal basis clustering based feature selection method in [18], Lim and Kim [19] incorporated a pairwise dependence term into the objective function. Lin et al. [20] utilized a locality preserving term and a graph regularization term in the orthogonal decomposition based formulation. These methods [15]-[20] are typically designed for single-view unsupervised feature selection, but lack the ability to leverage the rich and complementary information in data with multiple views.…”
Section: Unsupervised Feature Selection
Mentioning confidence: 99%
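The "locality preserving term" and "graph regularization term" referenced above are typically instances of the graph-Laplacian penalty Tr(Fᵀ L F), which encourages samples connected in the affinity graph to receive similar low-dimensional representations. A minimal sketch, assuming an unnormalised Laplacian (the exact regularizer used in [20] may differ):

```python
import numpy as np

def graph_regularizer(F, W):
    """Locality-preserving penalty Tr(F^T L F), with L = D - W.

    F: (n, c) embedding / cluster-indicator matrix,
    W: (n, n) symmetric affinity matrix.
    Equals 0.5 * sum_ij W_ij * ||f_i - f_j||^2, so it is small exactly
    when strongly connected samples (large W_ij) get similar rows of F.
    """
    L = np.diag(W.sum(axis=1)) - W  # unnormalised graph Laplacian
    return np.trace(F.T @ L @ F)
```

Adding this term to an orthogonal-decomposition objective penalizes embeddings that break the local neighbourhood structure of the original data, which is how local structure preservation is enforced alongside the clustering term.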