Semi-supervised local feature selection for data classification
2021
DOI: 10.1007/s11432-020-3063-0

Cited by 39 publications (5 citation statements)
References 30 publications
“…We first used five image classification datasets to test the classification performance of the proposed method and then employed two image datasets and two subsets of UCI data to verify the clustering performance of the proposed method. In the experiment, we compared our proposed method with some contemporary UFS and SSFS methods, including two UFS methods (SPNFSR [56] and NNSAFS [57]) and six SSFS methods (RLSR [19], FDEFS [50], GS3FS [43], S2LFS [44], AGLRM [47], and ASLCGLFS [48]).…”
Section: Experiments and Analysis
confidence: 99%
“…The second problem is that the spatial distribution of the sample label information is not sufficiently considered, resulting in the weak discriminative ability of the selected features, which further leads to poor classification or clustering performance. To alleviate this issue, label propagation (LP) has been incorporated into the FS methods [42][43][44]. However, since LP is also a graph-learning-based algorithm, the quality of the learned graph affects the performance to some extent.…”
Section: Introduction
confidence: 99%
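The excerpt above notes that label propagation (LP) is itself a graph-based algorithm, so its output depends on the quality of the learned affinity graph. As a minimal illustration of that dependence (not the method from the cited papers), the following sketch implements the standard iterative LP update F ← αSF + (1−α)Y over a symmetrically normalized affinity matrix; all names and parameters here are illustrative:

```python
import numpy as np

def label_propagation(W, Y, alpha=0.99, n_iter=100):
    """Minimal label-propagation sketch.

    W: (n, n) symmetric affinity matrix (the learned graph).
    Y: (n, c) one-hot labels; all-zero rows mark unlabeled samples.
    Returns the predicted class index for every sample.
    """
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt          # normalized graph
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        # Propagate labels along edges, anchored to the known labels.
        F = alpha * (S @ F) + (1 - alpha) * Y
    return F.argmax(axis=1)
```

Because propagation only moves labels along edges of W, a poorly learned graph (e.g., edges crossing class boundaries) directly degrades the propagated labels, which is the sensitivity the excerpt points out.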
“…The majority of existing approaches are designed for only one type of community structure. For instance, the methods based on random walks for assortative community, [18] the spectral methods for assortative community [16] and for bipartite community, [14] the optimization methods for assortative community, [17] bipartite community, [9] and both of them, [26] the methods based on statistical inference for assortative community [21,22,24,25] and bipartite community, [23] and the deep learning methods for assortative community. [28,29] To adapt one approach to another community structure, considerable work is needed.…”
Section: Related Work
confidence: 99%
“…But the community exploration through statistical models differs from data classification by feature selection. [26] Our model aims to learn the latent and unknown features described by our assumption, whereas feature selection for data classification aims to learn a subset of features that have already been given.…”
Section: Introduction
confidence: 99%
“…It offers quick training and effective learning outcomes, which are important characteristics. Building on this, the distributed extreme learning machine (D-ELM) not only avoids reading all samples into memory at once by partitioning the matrix operations, but also resolves the memory-shortage issue when training on vast amounts of sample data [6, 7]. On each run, it trains only an ELM network with a fixed number of hidden-layer nodes.…”
Section: Introduction
confidence: 99%
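The excerpt above refers to the extreme learning machine, whose defining trait is that the hidden-layer weights are random and fixed while only the output weights are solved in closed form by least squares. A minimal single-machine sketch of that idea (the distributed matrix partitioning of D-ELM is not shown, and all names here are illustrative):

```python
import numpy as np

def elm_train(X, T, n_hidden=40, seed=0):
    """Minimal ELM sketch: random fixed hidden layer, least-squares output.

    X: (n, d) inputs; T: (n, c) one-hot targets.
    Returns the random hidden weights/bias and the solved output weights.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # fixed, never trained
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                 # hidden-layer activations
    beta = np.linalg.pinv(H) @ T           # Moore-Penrose least-squares fit
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Since training reduces to one pseudo-inverse solve over H, the main cost is a large matrix operation, which is exactly what D-ELM partitions across machines to avoid loading all samples into memory at once.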