Seventh IEEE International Conference on Data Mining (ICDM 2007) 2007
DOI: 10.1109/icdm.2007.89
Spectral Regression: A Unified Approach for Sparse Subspace Learning

Abstract: Recently the problem of dimensionality reduction (or, subspace learning)


Cited by 184 publications (119 citation statements)
References 17 publications
“…For a multi-class problem, the authors suggested the one-against-the-rest scheme by considering all two-class problems. Recent studies [4][5][6][7][9] show that various linear dimensionality reduction algorithms can be formulated as regression problems and thus have efficient computational solutions. Particularly, our previous work [6] has demonstrated that LDA can be formulated as a regression problem and be efficiently solved.…”
Section: Introduction
confidence: 99%
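The statement above refers to the spectral-regression idea: rather than solving LDA's generalized eigenproblem directly, one first obtains target responses from the class structure and then fits them by regularized least squares. A minimal sketch of that two-step recipe, assuming a ridge penalty (the function name, `alpha`, and the QR-based choice of responses are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def spectral_regression_lda(X, y, alpha=1.0):
    """Sketch of LDA via spectral regression (assumed formulation):
    step 1: build response vectors spanning the class indicators;
    step 2: ridge-regress the data onto those responses."""
    classes = np.unique(y)
    n, d = X.shape
    # Class-indicator matrix: one column per class.
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[y == c, j] = 1.0
    Y -= Y.mean(axis=0)            # center, matching the centered LDA eigenproblem
    Q, _ = np.linalg.qr(Y)         # orthonormal responses in the indicator span
    Q = Q[:, :len(classes) - 1]    # LDA yields at most c-1 discriminative directions
    # Ridge regression per response column: (X^T X + alpha*I) w = X^T q
    A = X.T @ X + alpha * np.eye(d)
    W = np.linalg.solve(A, X.T @ Q)
    return W                       # d x (c-1) projection matrix
```

Replacing the ridge penalty with an L1 penalty in step 2 is what would yield the sparse projections the paper's title refers to.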
“…PCA, LDA and Locality Preserving Projection (LPP) [12], sparse LDA (SLDA) [5]. We also compare it with the feature selection methods, e.g., Fisher score (FS) and Linear Discriminant Feature Selection (LDFS) [16].…”
Section: Methods
confidence: 99%
“…LDA has been successfully applied to face recognition [2]. Following LDA, many incremental works have been done, e.g., Uncorrelated LDA and Orthogonal LDA [26], Local LDA [24], Semi-supervised LDA [4] and Sparse LDA [17] [5]. Note that all these methods suffer from the weakness of using all the original features to learn the subspace.…”
Section: Linear Discriminant Analysis
confidence: 99%