2022
DOI: 10.1109/tnnls.2020.3043362

Unsupervised Feature Selection With Constrained ℓ₂,₀-Norm and Optimized Graph

Cited by 38 publications (8 citation statements)
References 28 publications
“…After establishing such a model, we can follow the solution method of Nie et al.²⁵ to solve Equation (13). We can then obtain an index vector q of the selected elements, derived from the k maximal diagonal elements of (A)⁻¹B.…”
Section: The Methodology
confidence: 99%
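The selection rule quoted above — taking the indices of the k maximal diagonal elements of (A)⁻¹B as the index vector q — can be sketched as follows. This is a hypothetical illustration: the matrices A and B below are toy stand-ins, not the matrices constructed in the cited solver.

```python
import numpy as np

def select_by_diagonal(A, B, k):
    """Return indices of the k largest diagonal entries of A^{-1} B.

    Hypothetical illustration of the selection rule; A and B stand in
    for the matrices built by the cited method and are not derived here.
    """
    M = np.linalg.solve(A, B)      # A^{-1} B without forming an explicit inverse
    diag = np.diag(M)
    # argsort is ascending; the last k positions hold the k maximal entries
    q = np.argsort(diag)[-k:][::-1]
    return q

# toy example: with A = I, the diagonal of A^{-1} B is just the diagonal of B
A = np.eye(5)
B = np.diag([1.0, 4.0, 2.0, 5.0, 3.0])
print(select_by_diagonal(A, B, 2))   # → [3 1]
```

With A set to the identity the answer is readable by eye: the two largest diagonal entries of B (5.0 and 4.0) sit at indices 3 and 1, so those two features are selected at once.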
“…Still, according to the calculation result of Equation (14), its convergence positively correlates with the change of U. For the other variables, according to the convergence analyses of Nie et al.²⁵ and Wang et al.,²⁶ the value of the objective function decreases or remains unchanged at each step of optimizing a variable. Therefore, our alternating update rule makes the objective function value fall monotonically and guarantees convergence of the objective function.…”
Section: Complexity and Convergence Analysis
confidence: 98%
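The monotonicity argument in this statement — each block update can only decrease the objective or leave it unchanged — can be demonstrated on a generic alternating minimization. The sketch below uses plain alternating least squares for ‖X − UV‖²_F as a stand-in, not the cited algorithm: each block update is an exact least-squares solve, so the recorded objective values form a non-increasing sequence.

```python
import numpy as np

# Minimal alternating-minimization sketch (assumed stand-in, not the
# cited method): minimize ||X - U V||_F^2 over U and V, one block at
# a time. Each update solves its subproblem exactly, so the objective
# cannot increase -- the monotonicity the convergence analysis relies on.
rng = np.random.default_rng(1)
X = rng.standard_normal((8, 6))
U = rng.standard_normal((8, 2))
V = rng.standard_normal((2, 6))

objs = []
for _ in range(20):
    U = X @ np.linalg.pinv(V)      # optimal U for fixed V
    V = np.linalg.pinv(U) @ X      # optimal V for fixed U
    objs.append(np.linalg.norm(X - U @ V) ** 2)

# objective values fall monotonically (up to tiny numerical noise)
assert all(b <= a + 1e-9 for a, b in zip(objs, objs[1:]))
print(objs[0], objs[-1])
```

The same reasoning applies regardless of which block variables are cycled: exactness of each subproblem solve is what yields the monotone decrease, not the particular objective.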
“…Wang et al. [22] proposed a robust optimal graph clustering framework in which a robust graph is constructed with adaptive neighbors for each data sample. Nie et al. [23] also suggested an unsupervised feature selection method with a row-sparsity constraint and an optimized graph. Controllable sparse learning [24] was employed to eliminate noise in the completed sample correlations in a feature-dependency-based unsupervised feature selection approach.…”
Section: Introduction
confidence: 99%
“…However, the above three algorithms only solve a relaxed version of the original ℓ₂,₀-norm problem, which tends to weaken performance. Nie et al. proposed unsupervised feature selection with a constrained ℓ₂,₀-norm and optimized graph (RSOGFS) [28]. RSOGFS tackles the ℓ₂,₀-norm problem directly, so the required features are chosen all at once rather than one by one.…”
Section: Introduction
confidence: 99%
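For context on the ℓ₂,₀-norm that this statement contrasts with its relaxations: applied to a projection matrix W, it counts the number of rows with nonzero Euclidean norm, so constraining it to k selects exactly k features. A minimal sketch, with a toy W and an assumed numerical tolerance:

```python
import numpy as np

def l20_norm(W, tol=1e-12):
    """Number of rows of W with nonzero Euclidean norm (the l_{2,0} 'norm')."""
    return int(np.sum(np.linalg.norm(W, axis=1) > tol))

# toy projection matrix: rows 1 and 3 are nonzero, rows 0 and 2 are zero
W = np.array([[0.0, 0.0],
              [1.0, -2.0],
              [0.0, 0.0],
              [0.5, 0.0]])
print(l20_norm(W))   # → 2, i.e. two features selected
```

Relaxations such as the ℓ₂,₁-norm replace this row count with the sum of row norms, which is why the statement notes that solving the ℓ₂,₀-constrained problem directly selects the feature subset in one shot.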