2021
DOI: 10.1002/nla.2409
An extension of fast iterative shrinkage‐thresholding algorithm to Riemannian optimization for sparse principal component analysis

Abstract: Sparse principal component analysis (PCA), an important variant of PCA, attempts to find sparse loading vectors when conducting dimension reduction. This paper considers the nonsmooth Riemannian optimization problem associated with the ScoTLASS model [1] for sparse PCA, which can impose orthogonality and sparsity simultaneously. A Riemannian proximal method was proposed in the work of Chen et al. [9] for the efficient solution of this optimization problem. In this paper, two acceleration schemes are introduced. First …
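As a rough illustration of the setting (not the paper's actual algorithm — the proposed method solves a tangent-space proximal subproblem and adds a Nesterov-type acceleration step, neither of which is reproduced here), a bare-bones Riemannian proximal-gradient sketch for a penalized ScoTLASS-type objective min over orthonormal X of -0.5·tr(XᵀAᵀAX) + λ‖X‖₁ might look like the following. The function name `sparse_pca_rpg`, the fixed step size, and the polar retraction are all illustrative assumptions:

```python
import numpy as np

def soft_threshold(Z, tau):
    """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def polar_retraction(Y):
    """Map an arbitrary n-by-r matrix back onto the Stiefel manifold
    (orthonormal columns) via the polar factor of its thin SVD."""
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def sparse_pca_rpg(A, r, lam, step=1e-2, iters=500, seed=0):
    """Simplified Riemannian proximal-gradient sketch (illustrative only):
    gradient step on the smooth term, soft-threshold for the l1 term,
    then retract to restore orthonormality."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    # Random orthonormal starting point on the Stiefel manifold St(n, r).
    X = np.linalg.qr(rng.standard_normal((n, r)))[0]
    S = A.T @ A
    for _ in range(iters):
        G = -S @ X  # Euclidean gradient of the smooth part -0.5*tr(X'SX)
        # Project onto the tangent space of the Stiefel manifold at X.
        G = G - X @ ((X.T @ G + G.T @ X) / 2.0)
        # Forward step + prox of the l1 penalty, then retract.
        X = polar_retraction(soft_threshold(X - step * G, step * lam))
    return X
```

Note that the final retraction mixes entries and so destroys exact zeros; practical solvers of this kind typically read the sparsity pattern off the proximal iterate or solve the proximal subproblem directly in the tangent space, as the paper does.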

Cited by 14 publications (6 citation statements)
References 33 publications
“…• An inexact accelerated Riemannian proximal gradient method (IARPG) is used to solve the optimization problem from the proposed model. IARPG merges the Nesterov momentum acceleration technique for Riemannian optimization [HW21b] with the inexact Riemannian proximal gradient method (RPG) of [HW21c]. Note that although existing RPGs are designed for general manifolds, they have not been used on the manifold of fixed-rank matrices before.…”
Section: Our Work and Main Contributions
confidence: 99%
“…where A ∈ R^{m×n} is the data matrix. This model is a penalized version of the ScoTLASS model introduced in [JTU03], and it has been used in [CMSZ20, HW21b].…”
Section: SPCA Test
confidence: 99%
“…where M is a finite-dimensional Riemannian manifold. Such optimization problems are of interest due to many important applications including, but not limited to, compressed modes [OLCO13], sparse principal component analysis [ZHT06, HW21b], sparse variable principal component analysis [US08, CMW13, XLY20], discriminative k-means [YZW08], texture and image inpainting [LRZM12], cosparse factor regression [MDC17], and low-rank sparse coding [ZGL+13, SQ16].…”
Section: Introduction
confidence: 99%