2019
DOI: 10.1016/j.neucom.2018.10.027
Semi-supervised feature selection analysis with structured multi-view sparse regularization

Cited by 29 publications (8 citation statements)
References 32 publications
“…To further account for the importance of different views, several multiview schemes have been proposed to learn weights for different views [12,14,20,21,33]. In [33], Xu et al. proposed a weighted multiview clustering with feature selection method (WMCFS) that improves clustering accuracy by designing two weighting schemes.…”
Section: Related Work
confidence: 99%
“…Compared with feature extraction, feature selection methods preserve the physical meaning of the original features. According to label availability, feature selection methods are classified into supervised [3,16], semi-supervised [22,35], and unsupervised [11,30] methods. Among the three, supervised methods are most likely to select a discriminative feature subset because they can utilize label information and consider the correlation between labels and features.…”
Section: Introduction
confidence: 99%
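The label–feature correlation that supervised selection exploits can be made concrete with a minimal sketch. The snippet below (plain NumPy; `fisher_scores` is an illustrative helper, not a function from any cited method) ranks features by a Fisher-style ratio of between-class to within-class variance, so features that separate the classes score higher:

```python
import numpy as np

def fisher_scores(X, y):
    """Score each feature by between-class vs. within-class variance.

    A higher score means the feature separates the classes better --
    the label-feature correlation that supervised selection exploits.
    """
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        # Weight each class term by its sample count.
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)

# Toy data: feature 0 tracks the class label, feature 1 is noise.
X = np.array([[0.0, 1.0], [0.1, -1.0], [1.0, 0.9], [1.1, -0.8]])
y = np.array([0, 0, 1, 1])
scores = fisher_scores(X, y)  # feature 0 scores far higher than feature 1
```

Selecting the top-k features by this score is the simplest supervised filter; the cited supervised methods [3,16] build richer criteria on the same principle.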
“…Based on the presence or absence of labels, existing feature selection methods can be roughly divided into three categories: supervised feature selection (Taher et al. 2019), unsupervised feature selection (Shi et al. 2018), and semi-supervised feature selection (Wang and Wang 2017). Supervised feature selection can exploit both feature information and label information to achieve high accuracy when all samples in the data set are labeled.…”
Section: Introduction
confidence: 99%
“…Research on semi-supervised feature selection has also been carried out. Shi et al. [10] developed a structured multi-view Hessian sparse semi-supervised feature selection framework that improves feature selection performance. Yu et al. [11] proposed an adaptive semi-supervised feature selection method in which a graph-based constraint is used to predict accurate labels for unlabeled data; their experiments show the method obtains smaller feature subsets and higher classification accuracy.…”
Section: Introduction
confidence: 99%
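The graph-based idea mentioned above — predicting labels for unlabeled data from a similarity graph — can be illustrated with a generic label-propagation sketch. This is a hedged, minimal version of the general technique (Gaussian similarity graph, iterative diffusion), not the specific constraint used by Yu et al. [11]; `propagate_labels`, `alpha`, and `sigma` are illustrative names and defaults:

```python
import numpy as np

def propagate_labels(X, y, alpha=0.9, iters=50, sigma=1.0):
    """Predict labels for unlabeled points via graph-based propagation.

    y uses -1 for unlabeled samples. Labels diffuse over a Gaussian
    similarity graph, so unlabeled points inherit the labels of their
    nearest neighbors while labeled points stay anchored to their seeds.
    """
    n = len(X)
    # Pairwise squared distances -> Gaussian similarity graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)  # row-normalized transition matrix

    classes = np.unique(y[y >= 0])
    F = np.zeros((n, len(classes)))       # soft label matrix
    labeled = y >= 0
    F[labeled, np.searchsorted(classes, y[labeled])] = 1.0
    Y0 = F.copy()
    for _ in range(iters):
        # Diffuse labels along the graph, then pull back toward the seeds.
        F = alpha * (P @ F) + (1 - alpha) * Y0
    return classes[F.argmax(axis=1)]

# Two clusters with one labeled point each; the unlabeled points
# (y == -1) are assigned the label of their own cluster.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
y = np.array([0, -1, 1, -1])
pred = propagate_labels(X, y)
```

Once pseudo-labels are obtained this way, a semi-supervised selector can score features against them much as a supervised method would against true labels.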