2022
DOI: 10.1016/j.neucom.2022.09.074

Robust unsupervised feature selection via sparse and minimum-redundant subspace learning with dual regularization

Cited by 8 publications (1 citation statement)
References 45 publications

“…Guo [19] used the Unsupervised Feature Selection with Adaptive Structure (FSASL) method to eliminate the impact of noise, redundancy, and missing values on the original signal, improving the effectiveness of feature selection. Zeng [20] eliminated the influence of noise or outliers in real fault data through the Robust Unsupervised Feature Selection (RUFS) algorithm. Zhu [21] combined the construction of similarity matrices with the feature selection process through the Graph Learning Unsupervised Feature Selection (GLUFS) algorithm, improving the effectiveness of the feature selection process.…”
Section: Introduction
Confidence: 99%