2019
DOI: 10.1016/j.neunet.2019.03.008

Flexible unsupervised feature extraction for image classification

Abstract: Dimensionality reduction is one of the fundamental and important topics in the fields of pattern recognition and machine learning. However, most existing dimensionality reduction methods aim to seek a projection matrix W such that the projection W^T x is exactly equal to the true low-dimensional representation. In practice, this constraint is too rigid to capture the geometric structure of the data well. To tackle this problem, we relax this constraint and instead impose an elastic one on the projection, with the aim to reveal…
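As a rough illustration of the contrast the abstract draws, the sketch below fits a projection on toy data in two ways: a conventional least-squares fit that tries to make W^T x reproduce a given low-dimensional target exactly, and a generic ridge-style relaxation that only penalizes the mismatch. This is a minimal sketch assuming synthetic data, stand-in targets Y, and a regularization weight lam; it is not the paper's actual elastic formulation, which the truncated abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 50, 5              # samples, original dim, reduced dim (toy sizes)
X = rng.standard_normal((n, d))
Y = rng.standard_normal((n, k))   # stand-in low-dimensional targets (assumed)

# Conventional "rigid" view: fit W by least squares so that X @ W reproduces
# the target representation Y as exactly as possible.
W_rigid, *_ = np.linalg.lstsq(X, Y, rcond=None)

# A generic relaxation: penalize the mismatch instead of enforcing equality,
# here with a simple ridge term on W (illustrative only, not the paper's model).
lam = 1.0
W_relaxed = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

Z = X @ W_relaxed   # extracted low-dimensional features
print(Z.shape)      # (200, 5)
```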

Cited by 50 publications (15 citation statements)
References 28 publications (32 reference statements)
“…Feature extraction is an important topic and the basis of pattern recognition and machine learning [30]. Principal component analysis is a method of feature extraction.…”
Section: Qualitative Identification Methods for Gas Mixture (mentioning)
confidence: 99%
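The quoted statement names principal component analysis as a feature extraction method. A minimal scikit-learn sketch of PCA-based feature extraction followed by a simple classifier is shown below; the digits dataset, 20 components, and logistic regression are assumptions chosen only for illustration and are not taken from the citing work.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)          # 64-dimensional digit images
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# PCA extracts a small set of orthogonal components that capture most of the
# variance; a simple classifier is then trained on the reduced representation.
model = make_pipeline(PCA(n_components=20), LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```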
“…Feature selection or gene selection is a popular and powerful approach in medical datasets to overcome this shortcoming [33][34][35]. In gene selection, to decrease the microarray data dimensions, by eliminating the irrelevant and similar genes, only a subset of relevant and dissimilar genes that are strongly related to the objective function are selected [36]. This is a powerful strategy to increase the efficiency of the machine learning algorithm, reduce time complexity, build more general classification algorithm, and reduce storage requirements [37,38].…”
Section: Feature Selection (mentioning)
confidence: 99%
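The quoted passage describes filter-style gene selection on high-dimensional microarray data: keep only genes strongly related to the target. A minimal sketch of that idea using scikit-learn's SelectKBest with an ANOVA F-score follows; the toy data shapes, the scoring function, and k=50 are assumptions, and removing redundant (mutually similar) genes would need an additional step, such as correlation filtering, that is not shown here.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
# Toy "microarray-like" data: many features (genes), few samples (assumed sizes).
X = rng.standard_normal((60, 2000))
y = rng.integers(0, 2, size=60)

# Keep the 50 genes whose ANOVA F-score against the class label is highest;
# irrelevant genes are discarded, shrinking the dimensionality dramatically.
selector = SelectKBest(score_func=f_classif, k=50).fit(X, y)
X_reduced = selector.transform(X)
print(X.shape, "->", X_reduced.shape)   # (60, 2000) -> (60, 50)
```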
“…Both methods for the reduction of dimensionality are designed to improve learning efficiency, minimize computational complexity, develop more generalizable models, and reduce needed storage [6][7][8]. Feature selection has been an active research area in data mining, pattern recognition, and statistics communities [9]. The total search space to find the most relevant and non-redundant features, including all possible subsets, is 2^n, where n is the number of original features.…”
Section: Introduction (mentioning)
confidence: 99%
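To make the 2^n search space concrete, the snippet below simply counts the candidate feature subsets for a few illustrative feature counts; exhaustive enumeration is clearly infeasible beyond small n, which is why greedy or heuristic selection strategies are used in practice. The chosen values of n are arbitrary examples, not figures from the cited works.

```python
# The number of candidate feature subsets grows as 2**n (including the empty set),
# so exhaustive search over subsets quickly becomes infeasible.
for n in (10, 20, 30, 100):
    subsets = 2 ** n
    print(f"n = {n:3d}: {subsets:,} possible subsets")
```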