2017
DOI: 10.1016/j.neucom.2016.08.124

Iterative sparsity score for feature selection and its extension for multimodal data

Cited by 5 publications (2 citation statements)
References 25 publications
“…After calculating the score for each feature, the features are sorted in ascending order of SparseScore and the most relevant ones are selected. In their classification experiments, Liu et al. demonstrated that this score outperforms other methods in most cases, especially for multi-class problems [36].…”
Section: Feature Selection
confidence: 99%
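To make the ranking step concrete, here is a minimal NumPy sketch. It assumes the sparse similarity matrix S has already been constructed (e.g., by reconstructing each sample from all the others under an L1 penalty); the residual term I - S - S^T + S^T S is the generic sparsity-preserving form, and the function names are illustrative rather than the authors' own code, whose exact score formula in [36] may differ.

```python
import numpy as np

def sparse_score(X, S):
    """Score each feature by how well it preserves the sparse
    reconstruction relationships encoded in S (lower = better).

    X : (n_samples, n_features) data matrix.
    S : (n_samples, n_samples) sparse similarity matrix.
    """
    n = X.shape[0]
    L = np.eye(n) - S - S.T + S.T @ S        # reconstruction-residual operator
    H = np.eye(n) - np.ones((n, n)) / n      # centering operator (variance)
    num = np.einsum('ij,ik,kj->j', X, L, X)  # f_j^T L f_j for every feature j
    den = np.einsum('ij,ik,kj->j', X, H, X)  # f_j^T H f_j for every feature j
    return num / den

def select_features(X, S, k):
    """Sort features in ascending order of SparseScore and keep the k best."""
    return np.argsort(sparse_score(X, S))[:k]
```

The ascending sort reflects that a small score means the feature's values can be reconstructed by the same sparse combinations as the full samples, i.e., the feature preserves the sparse structure of the data.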
“…Liu et al. extend the unsupervised sparsity score to the supervised context by utilizing class label information [36,37]. Let $x_{ri}^c$ denote the $i$-th feature of the $r$-th instance in class $c$, let $\hat{s}_{rq}^c$ be the $(r,q)$ entry of the sparse similarity matrix constructed within class $c$, and let $\mathbf{e}^c$ be an $N$-dimensional vector whose $r$-th entry is 1 if $x_r$ belongs to class $c$ and 0 otherwise.…”
Section: Feature Selection
confidence: 99%
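A sketch of the supervised variant under the same assumptions: the reconstruction residual is now accumulated per class, using a sparse similarity matrix built from each class's samples only, with the class indicator vector realized as a boolean mask. Again, this mirrors the structure described above rather than reproducing the exact formula of [36,37].

```python
import numpy as np

def supervised_sparse_score(X, y, S_by_class):
    """Supervised sparsity score: within-class reconstruction residual
    over global variance (lower = more relevant).

    X          : (n_samples, n_features) data matrix.
    y          : (n_samples,) class labels.
    S_by_class : dict mapping class label c to the sparse similarity
                 matrix built from the samples of class c only.
    """
    n, d = X.shape
    num = np.zeros(d)
    for c, Sc in S_by_class.items():
        mask = (y == c)              # indicator vector: 1 if sample is in class c
        Xc = X[mask]
        nc = Xc.shape[0]
        # Within-class residual operator built from the class-specific matrix.
        Lc = np.eye(nc) - Sc - Sc.T + Sc.T @ Sc
        num += np.einsum('ij,ik,kj->j', Xc, Lc, Xc)
    H = np.eye(n) - np.ones((n, n)) / n   # global centering for the denominator
    den = np.einsum('ij,ik,kj->j', X, H, X)
    return num / den
```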