2017
DOI: 10.1109/access.2017.2699741
Structure Preserving Non-negative Feature Self-Representation for Unsupervised Feature Selection

Cited by 32 publications (21 citation statements)
References 21 publications
“…Each sample is given a weight that reflects its representational ability. The most representative samples are selected as dictionary atoms [23]-[28]. …”
Section: The Proposed Methods (mentioning)
confidence: 99%
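A minimal sketch of the weighting idea in the statement above, assuming a ridge-regularized sample self-representation X ≈ CX; the function name, regularizer, and scoring rule are illustrative assumptions, not the exact formulation of the cited papers [23]-[28]:

```python
import numpy as np

def select_dictionary_atoms(X, k, lam=0.1):
    """Score samples by representational ability and keep the top k.

    Hypothetical sketch: solve min_C ||X - C X||_F^2 + lam ||C||_F^2
    in closed form, then weight sample j by how strongly column j of C
    is used to reconstruct all the other samples.
    """
    n = X.shape[0]
    G = X @ X.T                                  # n x n Gram matrix of samples
    C = np.linalg.solve(G + lam * np.eye(n), G)  # closed-form ridge solution to X ~= C X
    weights = np.linalg.norm(C, axis=0)          # column norm = representational ability of each sample
    atoms = np.argsort(weights)[::-1][:k]        # most representative samples first
    return X[atoms], atoms
```

Samples with large column norms are the ones the whole data set leans on for reconstruction, so they serve as the dictionary atoms.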
“…In essence, FS is a dimensionality reduction process in which the features themselves do not change. A good FS algorithm can effectively reduce the dimensionality of the original feature set, has low computational complexity, and can improve the effectiveness of subsequent classification and clustering [31]. Similar to FS, FE also reduces the dimension of the original feature set.…”
Section: A. Feature Engineering (mentioning)
confidence: 99%
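A small numpy illustration of the FS/FE distinction drawn in that statement; the toy data and the variance-based ranking are assumptions made purely for the example. Feature selection keeps a subset of the original columns unchanged, while feature extraction (PCA here) produces new features that mix all of the originals:

```python
import numpy as np

# Toy data: 100 samples, 5 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))

# Feature selection (FS): keep a subset of the original columns;
# the retained features are unchanged (ranked by variance here
# purely for illustration).
fs_idx = np.argsort(X.var(axis=0))[::-1][:2]
X_fs = X[:, fs_idx]                # still the original, interpretable features

# Feature extraction (FE): project onto new axes (PCA via SVD);
# each new feature is a linear mixture of all the originals.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X_fe = Xc @ Vt[:2].T               # transformed features, no longer the originals
```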
“…Another line of work [23]-[32] is to select those features that can be used to well reconstruct or approximate the whole data set. Besides, it has also been verified that the local structure of the data is vitally important for unsupervised feature selection [16], [33]-[35]. Most recently, several techniques have been introduced to further improve unsupervised feature selection, such as adaptive graph learning [36], [37] and the ensemble of weak partitions [38].…”
Section: Related Work, A. Unsupervised Feature Selection (mentioning)
confidence: 99%
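To make the reconstruction idea concrete, here is a hedged sketch of generic self-representation-based feature selection: approximate X ≈ XW with an l2,1 row-sparsity penalty on W, solved by the standard iterative-reweighting scheme, then rank features by the row norms of W. It illustrates the family of methods in [23]-[32], not the exact algorithm of the surveyed paper:

```python
import numpy as np

def self_representation_fs(X, k, lam=1.0, n_iter=50, eps=1e-8):
    """Reconstruction-based unsupervised feature selection sketch.

    Solves min_W ||X - X W||_F^2 + lam * ||W||_{2,1} by iteratively
    reweighted least squares, then keeps the k features whose rows of
    W have the largest norms (they reconstruct the rest of the data best).
    """
    d = X.shape[1]
    G = X.T @ X                          # d x d feature Gram matrix
    W = np.eye(d)
    for _ in range(n_iter):
        # Diagonal reweighting matrix for the l2,1 norm.
        D = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + eps))
        # Closed-form update: W = (G + lam * D)^{-1} G.
        W = np.linalg.solve(G + lam * D, G)
    scores = np.linalg.norm(W, axis=1)   # large row norm => representative feature
    return np.argsort(scores)[::-1][:k]  # indices of the selected features
```

Structure-preserving variants in this line of work additionally constrain W (e.g., non-negativity) or add a graph-Laplacian term so that the local structure noted above is retained; those terms are omitted here for brevity.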