2020
DOI: 10.1109/tsp.2020.2967714
Subspace Learning and Feature Selection via Orthogonal Mapping

Cited by 11 publications
(2 citation statements)
References 33 publications
“…Let X ∈ R^{n×d} be the high-dimensional data; through subspace learning, we can learn a low-dimensional subspace of X that represents the original data well [7], [9]. Its formula can be expressed as follows:…”
Section: Proposed Methods
confidence: 99%
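The quoted formula is truncated above, so the paper's exact objective is not recoverable here. As a hedged illustration only, the general idea — learning an orthogonal mapping W so that the projection X W is a low-dimensional subspace that reconstructs X well — can be sketched with a PCA-style closed-form solution (the cited paper's actual formulation may differ):

```python
import numpy as np

# Illustrative sketch, NOT the paper's method: learn an orthogonal mapping
# W (d x k) so that Z = Xc @ W is a k-dimensional representation of the
# centered data Xc, minimizing the reconstruction error ||Xc - Z W^T||_F.
# The closed-form minimizer is the top-k eigenvectors of Xc^T Xc (PCA).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))   # n=100 samples, d=20 features
Xc = X - X.mean(axis=0)              # center the data

k = 5                                # target subspace dimension
_, eigvecs = np.linalg.eigh(Xc.T @ Xc)
W = eigvecs[:, -k:]                  # top-k directions, orthonormal columns

Z = Xc @ W                           # low-dimensional representation (100 x 5)
X_rec = Z @ W.T                      # reconstruction back in the original space

# The mapping is orthogonal: W^T W = I_k.
assert np.allclose(W.T @ W, np.eye(k), atol=1e-8)
```

The orthogonality constraint W^T W = I is what keeps the learned directions non-redundant; without it, the reconstruction objective has degenerate solutions.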
“…When the data dimension is too large, it becomes difficult to choose the best solution from the many possibilities (Kanwal et al. 2021; Mandanas and Kotropoulos 2020). The goal is to reduce the number of features and improve classification quality, so it is optimized as a single- or multi-objective problem (Lima et al.…”
Section: Binary Metaheuristic Algorithms In Applications
confidence: 99%
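The statement above frames feature selection as a binary optimization: each bit of a mask decides whether a feature is kept, and a fitness score trades model quality against subset size. As a minimal stand-in for a binary metaheuristic (not the algorithm from any of the cited works), a stochastic single-bit hill climb on synthetic data shows the pattern; the fitness function and penalty weight here are illustrative choices:

```python
import numpy as np

# Illustrative sketch: binary feature selection on synthetic regression data.
# Only the first 3 of 15 features are informative; the search should keep them.
rng = np.random.default_rng(1)
n, d = 200, 15
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[:3] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n)

def score(mask):
    """Fitness: R^2 of a least-squares fit on the selected features,
    minus a small penalty per selected feature (illustrative weighting)."""
    if mask.sum() == 0:
        return -np.inf
    Xs = X[:, mask.astype(bool)]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ coef
    r2 = 1.0 - resid.var() / y.var()
    return r2 - 0.01 * mask.sum()

mask = rng.integers(0, 2, size=d)    # random initial feature subset
for _ in range(500):                 # flip one bit at a time, keep improvements
    cand = mask.copy()
    cand[rng.integers(d)] ^= 1
    if score(cand) >= score(mask):
        mask = cand

selected = np.flatnonzero(mask)      # indices of the surviving features
```

Real binary metaheuristics (binary PSO, genetic algorithms, and similar) replace the single-bit flip with population-based moves, but the encoding — a bit per feature and a penalized fitness — is the same.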