2015
DOI: 10.1007/s10115-015-0901-0

Ensemble constrained Laplacian score for efficient and robust semi-supervised feature selection

Cited by 14 publications (8 citation statements)
References 27 publications

“…In this section, 11 representative traditional feature selection methods are employed to compare with OSFS-KW, including Fisher [36], spectral feature selection [37], Pearson correlation coefficient (PCC) [38], ReliefF [39], Laplacian Score [7], an unsupervised feature selection method with ordinal locality (UFSOL) [40], mutual information (MI) [41], the infinite latent feature selection method (ILFS) [42], lasso regression (Lasso) [43], a fast correlation-based filter method (FCBF) [44], and a correlation-based feature selection approach (CFS) [45].…”
Section: OSFS-KW Versus Traditional Feature Selection Methods
confidence: 99%
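Several of these excerpts benchmark against the Laplacian Score [7] (He et al., 2005), which is also the basis of the cited paper's method. For orientation, here is a minimal NumPy/scikit-learn sketch of that score; this is not code from any of the cited papers, and the neighborhood size `k` and heat-kernel width `t` are illustrative defaults:

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def laplacian_score(X, k=5, t=1.0):
    """Laplacian Score (He et al., 2005): lower scores indicate features
    that better preserve local manifold structure. X is (n_samples, n_features)."""
    # kNN graph with heat-kernel weights on the distances
    dist = kneighbors_graph(X, n_neighbors=k, mode="distance").toarray()
    S = np.where(dist > 0, np.exp(-dist ** 2 / t), 0.0)
    S = np.maximum(S, S.T)              # symmetrize the neighborhood graph
    d = S.sum(axis=1)                   # degree vector
    L = np.diag(d) - S                  # unnormalized graph Laplacian
    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r]
        f_tilde = f - (f @ d) / d.sum() # remove the degree-weighted mean
        denom = f_tilde @ (d * f_tilde) # f~' D f~
        scores[r] = (f_tilde @ L @ f_tilde) / denom if denom > 0 else np.inf
    return scores                       # select the features with the smallest scores
```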
“…Traditional methods are developed based on the assumption that all features are available. Many typical approaches exist, such as ReliefF [5], Fisher Score [6], mutual information (MI) [4], Laplacian Score [7], LASSO [8], and so on [9]. The main benefits of feature selection include speeding up model training, avoiding overfitting, and reducing the impact of dimensionality during data analysis [4].…”
Section: Introduction
confidence: 99%
“…. , 7) and by two semi-supervised feature selection methods, namely the semi-supervised pairwise constraint-guided sparse (SCGS) learning method [22] and the ensemble constrained Laplacian score (EnsCLS) method [16].…”
Section: Semi-supervised Feature Selection
confidence: 99%
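EnsCLS builds an ensemble of constrained Laplacian scores; its exact update rules are not reproduced here. As a rough sketch of the general idea shared by constraint-guided variants, pairwise must-link/cannot-link supervision is typically injected into the similarity graph before the Laplacian-style score is computed. The function name and the hard 1.0/0.0 updates below are illustrative assumptions, not the published rule:

```python
import numpy as np

def apply_pairwise_constraints(S, must_link, cannot_link):
    """Illustrative constraint injection (NOT the exact EnsCLS rule):
    reshape a similarity matrix S with pairwise supervision before a
    Laplacian-style score is computed on it. must_link / cannot_link
    are lists of (i, j) sample-index pairs."""
    S = S.copy()
    for i, j in must_link:
        S[i, j] = S[j, i] = 1.0   # must-link pairs: maximal similarity
    for i, j in cannot_link:
        S[i, j] = S[j, i] = 0.0   # cannot-link pairs: remove the edge
    return S
```

The constrained matrix can then be fed into a scoring routine such as the `laplacian_score` sketch above in place of the purely unsupervised kNN graph.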
“…Problem transformation approaches usually ignore label correlation or produce a large number of new labels, which reduces the effectiveness of the multi-label feature selection approach. Direct feature selection, also called adapted feature selection, improves traditional single-label feature selection approaches to perform feature selection on multi-label data directly [32]-[34]. In [35], a multi-label feature selection method based on Graph-Margin is proposed, which utilizes a graph to represent multi-label data and measures features based on large margin theory.…”
Section: B. Multi-label Feature Selection Approaches
confidence: 99%
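To make the problem-transformation side of that contrast concrete, here is a minimal binary-relevance sketch (the function name and `top_k` parameter are hypothetical): each feature is scored against every label independently and the scores are averaged, which is exactly where label correlation gets ignored, as the excerpt notes.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def binary_relevance_fs(X, Y, top_k=10):
    """Problem-transformation baseline (illustrative): score each feature
    against every label independently via mutual information, then
    aggregate. X is (n_samples, n_features); Y is a binary label
    matrix of shape (n_samples, n_labels)."""
    per_label = np.stack([mutual_info_classif(X, Y[:, j])
                          for j in range(Y.shape[1])])
    agg = per_label.mean(axis=0)            # average relevance across labels
    return np.argsort(agg)[::-1][:top_k]    # indices of the top-k features
```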