2016
DOI: 10.1016/j.patcog.2015.07.008
Regularized generalized eigen-decomposition with applications to sparse supervised feature extraction and sparse discriminant analysis

Cited by 22 publications (7 citation statements). References 34 publications (58 reference statements).
“…The generalized Hermitian eigenvalue problem (GHEP) [1] is of great interest in signal processing, machine learning and data analysis applications. The GHEP algorithms provide powerful tools to treat problems in blind source separation [2,3], feature extraction [4,5], noise filtering [6], fault detection [7], antenna array processing [8], classification [9], and speech enhancement [10]. Traditional methods for solving the GHEP include power and inverse iteration based methods, Lanczos method and Jacobi-Davidson method [1,11].…”
Section: Introduction (mentioning)
confidence: 99%
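The excerpt above lists power/inverse iteration, Lanczos, and Jacobi-Davidson as traditional GHEP solvers. For a dense, moderately sized pencil, the problem A v = λ B v (with B Hermitian positive definite) can be solved directly; a minimal sketch using SciPy's generalized symmetric solver, not any particular method from the cited works:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Build a random Hermitian pencil (A, B) with B positive definite.
X = rng.standard_normal((5, 5))
A = (X + X.T) / 2
Y = rng.standard_normal((5, 5))
B = Y @ Y.T + 5 * np.eye(5)  # diagonal shift keeps B well conditioned

# eigh solves the generalized problem A v = lambda B v when b is given.
w, V = eigh(A, b=B)

# The defining relation holds columnwise: A V = B V diag(w).
residual = np.max(np.abs(A @ V - B @ V @ np.diag(w)))
```

The returned eigenvectors are B-orthonormal (V.T @ B @ V ≈ I), which is the natural normalization for feature-extraction uses of the GHEP.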
“…The generalized eigenvalues and eigenvectors are extracted from a matrix pencil (A, B). In online applications [2, 4, 6-10], however, this pair is unknown, and the rank-1 update strategy [14-16, 18, 19] uses the observed streaming stochastic signals to estimate it. Also, in many cases, the signal subspace spanned by the dominant generalized eigenvectors lies in a low-dimensional space [10].…”
Section: Introduction (mentioning)
confidence: 99%
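The rank-1 update strategy described above replaces the unknown pencil (A, B) with running estimates built from streaming samples. A minimal sketch, assuming exponential-forgetting covariance updates (the forgetting factor and the two synthetic streams are illustrative choices, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(1)
d, beta = 4, 0.99  # dimension and forgetting factor (illustrative)

# Running estimates of the unknown pencil (A, B).
A_hat = np.eye(d)
B_hat = np.eye(d)

for _ in range(2000):
    x = rng.standard_normal(d)        # sample from the "signal" stream
    y = 0.5 * rng.standard_normal(d)  # sample from the "noise" stream
    # Rank-1 exponential-forgetting updates of the covariance pair.
    A_hat = beta * A_hat + (1 - beta) * np.outer(x, x)
    B_hat = beta * B_hat + (1 - beta) * np.outer(y, y)

# Dominant generalized eigenvector of the estimated pencil,
# via the equivalent standard problem B^{-1} A v = lambda v.
w, V = np.linalg.eig(np.linalg.solve(B_hat, A_hat))
v_top = V[:, np.argmax(w.real)].real
```

In a genuine online setting one would extract the dominant eigenvector incrementally as well (that is what the cited adaptive algorithms do); the batch eigensolve here only illustrates what the rank-1 estimates feed into.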
“…Therefore, tensor-based methods have been proposed because they can preserve the overall spatial structure, and they have been applied in several areas, including biological [21] and medical research [22], facial recognition [23], [24], natural image processing [25], hyperspectral image analysis [26]-[28], [42] and PolSAR image processing [29], [30]. Lower rank tensor approximation (LTRA) [26] is an extension of PCA to higher-order data sets that assumes strong global correlations in the spatial and feature domains.…”
Section: Introduction (mentioning)
confidence: 99%
“…() consider a regularized GEP with an ℓ0-norm penalty and use a majorization-minimization (MM) algorithm to solve the non-convex problem. Han and Clemmensen () transform the GEP into an eigen-decomposition and use the alternating direction method of multipliers (ADMM) to obtain sparse eigenvectors, which solves an optimization problem with an ℓ1 penalty.…”
Section: Introduction (mentioning)
confidence: 99%