2020
DOI: 10.48550/arxiv.2010.09791
Preprint

Tensor-structured sketching for constrained least squares

Abstract: In this work, we study a tensor-structured random sketching matrix that projects a large-scale convex optimization problem to a much lower-dimensional counterpart, yielding substantial memory and computation savings. We show that, while maintaining the prediction error between a random estimator and the true solution with high probability, the dimension of the projected problem achieves optimal dependence on the geometry of the constraint set. Moreover, the tensor structure and sparsity pattern of the struct…
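The abstract's sketch-and-solve idea can be illustrated with a minimal NumPy example. This is a hedged sketch under assumptions: each sketch row is taken to be the Kronecker product of two independent Gaussian vectors, and the least squares problem is left unconstrained for simplicity; it is not the paper's exact estimator or analysis.

```python
# Illustrative sketch-and-solve, assuming a row-wise Kronecker-structured
# Gaussian sketch (row i = kron(G[i], H[i]) / sqrt(m)). Not the paper's
# exact construction; the problem is unconstrained here for simplicity.
import numpy as np

rng = np.random.default_rng(0)

n1, n2, p, m = 32, 32, 10, 200
n = n1 * n2                      # ambient dimension: 1024

A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
b = A @ x_true + 0.1 * rng.standard_normal(n)   # noisy observations

# Tensor-structured sketch: row i is kron(G[i], H[i]) / sqrt(m).
G = rng.standard_normal((m, n1))
H = rng.standard_normal((m, n2))
S = np.einsum("ij,ik->ijk", G, H).reshape(m, n) / np.sqrt(m)

# Solve the small m x p sketched problem instead of the n x p one.
x_sk = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]
x_opt = np.linalg.lstsq(A, b, rcond=None)[0]

res_sk = np.linalg.norm(A @ x_sk - b)
res_opt = np.linalg.norm(A @ x_opt - b)
print(res_sk / res_opt)   # close to 1: the sketched fit is near-optimal
```

With m = 200 rows against an ambient dimension of 1024, the sketched solution's residual on the full problem stays within a small factor of the optimum, which is the memory/computation trade-off the abstract describes.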

Cited by 1 publication
(1 citation statement)
References 26 publications
“…Kronecker products of independent Gaussian vectors, and proved embedding properties for such constructions for Kronecker products of order d = 2. The paper [2] extended the analysis beyond d = 2, and [15] further refined and extended these results in the context of sketching constrained least squares problems.…”
Section: Related Work (mentioning)
confidence: 99%
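The embedding property the citing work mentions — that Kronecker products of independent Gaussian vectors approximately preserve norms (here for order d = 2) — can be checked numerically. The construction below is a hedged illustration: it also exploits the identity (g ⊗ h)ᵀx = gᵀX h with X = reshape(x), so the full-dimensional rows are never materialized.

```python
# Empirical check of the norm-preserving (embedding) property for
# order-2 Kronecker sketch rows g ⊗ h, g and h independent Gaussians.
# Assumption-laden illustration, not the cited papers' exact setup.
import numpy as np

rng = np.random.default_rng(1)
n1, n2, m = 30, 40, 5000

x = rng.standard_normal((n1, n2))
x /= np.linalg.norm(x)            # unit-norm test vector, reshaped to n1 x n2

# Apply m Kronecker-structured rows without forming them:
# row i maps x to G[i] @ x @ H[i], since (g ⊗ h)^T vec(x) = g^T x h.
G = rng.standard_normal((m, n1))
H = rng.standard_normal((m, n2))
Sx = np.einsum("mi,ij,mj->m", G, x, H) / np.sqrt(m)

print(np.sum(Sx**2))   # ≈ 1: squared norm preserved in expectation
```

Because E[(g ⊗ h)(g ⊗ h)ᵀ] = I ⊗ I = I, the sketched squared norm concentrates around ‖x‖² = 1 as m grows, albeit with heavier tails than an unstructured Gaussian sketch — which is why the order-d analysis in the cited works is nontrivial.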