2022
DOI: 10.1109/tit.2022.3146206
Adaptive and Oblivious Randomized Subspace Methods for High-Dimensional Optimization: Sharp Analysis and Lower Bounds

Abstract: We propose novel randomized optimization methods for high-dimensional convex problems based on restrictions of variables to random subspaces. We consider oblivious and data-adaptive subspaces and study their approximation properties via convex duality and Fenchel conjugates. A suitable adaptive subspace can be generated by sampling a correlated random matrix whose second order statistics mirror the input data. We illustrate that the adaptive strategy can significantly outperform the standard oblivious sampling…
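As a rough illustration of the oblivious-versus-adaptive contrast described in the abstract, the following minimal NumPy sketch restricts a least-squares problem to a random subspace. This is not the authors' algorithm: the test problem, the dimensions, and the particular adaptive construction S = GA (a Gaussian mixture of the data rows, so that the sketch's second-order statistics mirror A) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 500, 200, 20  # samples, ambient dimension, subspace dimension

# Hypothetical least-squares instance with a decaying spectrum.
A = rng.standard_normal((n, d)) * np.linspace(1.0, 0.01, d)
b = rng.standard_normal(n)

def sketched_objective(S):
    """Solve min_z ||A S^T z - b||^2, i.e. restrict x to the subspace range(S^T)."""
    z, *_ = np.linalg.lstsq(A @ S.T, b, rcond=None)
    x = S.T @ z
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

S_obl = rng.standard_normal((m, d))      # oblivious: data-independent Gaussian
S_ada = rng.standard_normal((m, n)) @ A  # adaptive: rows correlated with the data

f_star = 0.5 * np.linalg.norm(A @ np.linalg.lstsq(A, b, rcond=None)[0] - b) ** 2
print(f"full: {f_star:.4f}  oblivious: {sketched_objective(S_obl):.4f}  "
      f"adaptive: {sketched_objective(S_ada):.4f}")
```

On problems with fast spectral decay, the adaptive subspace typically achieves a much smaller objective gap than the oblivious one at the same sketch dimension m, which is the qualitative behavior the abstract describes.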

Cited by 4 publications (3 citation statements)
References 50 publications
“…Standard RandNLA guarantees such as the subspace embedding are sufficient (although not necessary) to ensure that Ĥt provides a good enough approximation to enable accelerated local convergence in time Õ(nd). These approaches have also been extended to distributed settings via RMT-based model averaging, with applications in ensemble methods, distributed optimization, and federated learning [110,109,144,48,92]. Further RandNLA-based Newton-type methods include: Subsampled Newton [75,159,18,17]; Hessian approximations via randomized Taylor expansion [1] and low-rank approximation [77,55]; Hessian diagonal/trace estimates via Hutchinson's method [136] and Stochastic Lanczos Quadrature, particularly for non-convex problems, e.g., PyHessian [176], AdaHessian [177]; and finally Stochastic Quasi-Newton type methods [106,137].…”
Section: Hessian Sketch (mentioning)
confidence: 99%
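Among the methods listed in this snippet, Hutchinson's trace estimator is simple enough to sketch in a few lines. The following is a hedged illustration of that one ingredient (not the cited papers' implementations); it needs only Hessian-vector products, and the explicit toy matrix is used purely to check the estimate.

```python
import numpy as np

def hutchinson_trace(hvp, dim, num_probes=100, rng=None):
    """Estimate trace(H) from Hessian-vector products alone.

    For Rademacher probes z, E[z^T H z] = trace(H), so averaging
    z @ hvp(z) over probes gives an unbiased trace estimate.
    """
    rng = rng or np.random.default_rng(0)
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=dim)
        total += z @ hvp(z)
    return total / num_probes

# Toy check against an explicit symmetric matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
H = M + M.T
print(hutchinson_trace(lambda v: H @ v, 50), "vs exact", np.trace(H))
```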
“…2 of a nearly-unbiased estimator, we can obtain a much smaller theory-practice gap than was possible with Classical RandNLA, which depended on the notion of a subspace embedding from Def. 1, for a broad range of implementations [44,109,110]. In the past, stronger theory was often derived for expensive Gaussian random projections, while implementations may or may not have actually used them, leading to a potentially large theory-practice gap.…”
Section: Connections With Modern RMT (mentioning)
confidence: 99%
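For reference, the subspace-embedding guarantee this snippet contrasts against (its Def. 1) can be checked numerically: S is an ε-subspace embedding for range(A) exactly when every singular value of SQ lies in [1 − ε, 1 + ε], where Q is an orthonormal basis of range(A). A small sketch, with all dimensions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 2000, 20, 200  # rows, subspace dimension, sketch size

A = rng.standard_normal((n, d))
Q, _ = np.linalg.qr(A)                        # orthonormal basis of range(A)
S = rng.standard_normal((m, n)) / np.sqrt(m)  # oblivious Gaussian sketch

# ||S A x|| = (1 ± eps) ||A x|| for all x iff the singular values of SQ
# lie in [1 - eps, 1 + eps]; the distortion eps is read off the extremes.
sv = np.linalg.svd(S @ Q, compute_uv=False)
print("empirical distortion eps ~", max(sv.max() - 1.0, 1.0 - sv.min()))
```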
“…As the GeNI-ADMM framework covers any approximate ADMM scheme that replaces the x-subproblem by a linear system solve, our convergence theory covers any ADMM scheme that uses fast linear system solvers. Given the recent flurry of activity on fast linear system solvers within the (randomized) numerical linear algebra community [27,17,31], these results will help realize these benefits for optimization problems as well. To demonstrate the power of the GeNI-ADMM framework, we establish convergence of NysADMM and another randNLA-inspired scheme, sketch-and-solve ADMM, whose convergence was left as an open problem in [8].…”
Section: Introduction (mentioning)
confidence: 99%
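To make "replaces the x-subproblem by a linear system solve" concrete, here is a minimal hedged sketch of ADMM for the lasso, where the x-update is exactly a linear solve and any fast (randomized or approximate) solver could stand in; conjugate gradients serves as the placeholder and the problem data are synthetic. This is not the GeNI-ADMM or NysADMM implementation from the cited work.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, d = 200, 100
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam, rho = 0.1, 1.0

Atb = A.T @ b
# Matrix-free operator for the x-subproblem: (A^T A + rho I) x = A^T b + rho (z - u).
H = LinearOperator((d, d), matvec=lambda v: A.T @ (A @ v) + rho * v)

x = z = u = np.zeros(d)
for _ in range(100):
    x, _ = cg(H, Atb + rho * (z - u))  # any fast linear solver could replace cg here
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # soft threshold
    u = u + x - z

print("nonzeros in z:", int((np.abs(z) > 1e-8).sum()))
```

The design point the quoted passage makes is visible in the loop: the only expensive step is the linear solve, so improving that single solver (for example, with a randomized preconditioner) improves the whole ADMM scheme.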