2021
DOI: 10.1109/tsp.2021.3058442

Majorization-Minimization on the Stiefel Manifold With Application to Robust Sparse PCA

Cited by 18 publications (22 citation statements)
References 43 publications
“…In practical settings for high-dimensional data, a variety of iterative local methods are often applied to solve nonconvex problems over the Stiefel manifold, ranging from gradient ascent along geodesics [2; 18; 1] to, more recently, majorization-minimization (MM) algorithms; Breloy et al. [14] applied MM methods to solve (1) with guaranteed convergence to a stationary point. While the computational complexity and memory requirements of these solvers scale well, the solutions they obtain lack any global optimality guarantees.…”
Section: Dual Certificate of the SDP
confidence: 99%
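The MM iteration referenced in this excerpt has a particularly simple form when the minorizing surrogate is linear in U: maximizing ⟨G, U⟩ over the Stiefel manifold is an orthogonal Procrustes problem solved by a thin SVD of G. The sketch below illustrates this for the quadratic objective tr(UᵀSU); the function names and the choice of objective are illustrative assumptions, not the algorithm of [14]:

```python
import numpy as np

def stiefel_procrustes(G):
    """argmax_{U : U^T U = I} <G, U>, solved by a thin SVD of G."""
    W, _, Vt = np.linalg.svd(G, full_matrices=False)
    return W @ Vt

def mm_pca(S, k, iters=1000, seed=0):
    """Maximize tr(U^T S U) over the Stiefel manifold St(d, k) by MM.

    For PSD S, tr(U^T S U) is convex in U, so it is minorized by its
    linearization at the current iterate; maximizing that linear
    surrogate over St(d, k) gives the closed-form update below.
    """
    d = S.shape[0]
    rng = np.random.default_rng(seed)
    U = np.linalg.qr(rng.standard_normal((d, k)))[0]  # random feasible start
    for _ in range(iters):
        U = stiefel_procrustes(S @ U)  # surrogate gradient is 2 S U
    return U
```

Each update can only increase the objective, so the scheme ascends monotonically to a stationary point, matching the guarantee cited above; for this particular quadratic objective it reduces to orthogonal iteration toward the top-k eigenspace.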
“…Using [4, Section 6.6.3] it can be shown that computing only the certificate, with a given Ū, reduces the flop count by a factor of O(d^3/k) compared with solving (SDP-D). Consequently, a first-order MM solver [14], whose cost is O(dk^2 + k^3) per iteration, combined with our global optimality certificate, is clearly preferable to solving the full SDP in (SDP-P) for large problems. See Appendix D.1 for more details.…”
Section: Dual Certificate of the SDP
confidence: 99%
“…Following this work, several LASSO and convex-optimization (particularly semidefinite-programming) approaches were developed to handle the sparsity of the singular vectors [11], [12], [13], [14]. Numerous methods address sPCA, including: greedy methods [15], geodesic steepest descent [16], Givens rotations [17], low-rank approximations via a regularized (sparsity-promoting) singular value decomposition [18], truncated power iterations [19], [20], [21], steepest descent on the Stiefel manifold using rotation matrices [22] or on the Grassmannian manifold [23], quasi-Newton optimization for the sparse generalized eigenvalue problem [24], iterative deflation techniques [25], and the minorization-maximization framework [26] with additional orthogonality constraints [27] or on the Stiefel manifold [28], …”
Section: Introduction
confidence: 99%
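Among the approaches listed in this excerpt, truncated power iteration lends itself to a compact sketch: multiply by the covariance, zero out all but the largest-magnitude entries, and renormalize. The version below is a minimal illustration of that idea under my own naming (`truncated_power_iteration`, `card`); it reproduces no specific cited implementation:

```python
import numpy as np

def truncated_power_iteration(S, card, iters=100):
    """Approximate leading eigenvector of S with at most `card` nonzeros."""
    d = S.shape[0]
    x = np.ones(d) / np.sqrt(d)  # dense, unit-norm start
    for _ in range(iters):
        y = S @ x
        small = np.argsort(np.abs(y))[:-card]  # all but the `card` largest entries
        y[small] = 0.0                         # hard-threshold to enforce sparsity
        x = y / np.linalg.norm(y)              # project back to the unit sphere
    return x
```

With card = d this reduces to ordinary power iteration; smaller values trade explained variance for sparsity of the recovered component.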