2017
DOI: 10.1137/15m1044680
Randomized QR with Column Pivoting

Abstract. The dominant contribution to communication complexity in factorizing a matrix using QR with column pivoting is due to the column-norm updates that are required to process pivot decisions. We use randomized sampling to approximate this process, which dramatically reduces communication in column selection. We also introduce a sample update formula to reduce the cost of sampling trailing matrices. Using our column selection mechanism, we observe results that are comparable in quality to those obtained from th…
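The sampling idea described in the abstract can be illustrated with a minimal sketch (hypothetical sizes and names, not the paper's implementation): column norms of a small Gaussian sample B = ΩA approximate the column norms of A well enough to drive pivot decisions, at a fraction of the cost.

```python
import numpy as np

# Hedged sketch: pivot selection from sample column norms instead of
# full column norms. The sizes m, n, ell are illustrative choices.
rng = np.random.default_rng(0)
m, n, ell = 500, 200, 20          # ell sample rows, ell << m

A = rng.standard_normal((m, n))
A[:, 7] *= 100.0                  # make one column clearly dominant

Omega = rng.standard_normal((ell, m)) / np.sqrt(ell)  # Gaussian sketch
B = Omega @ A                     # sample matrix: ell x n instead of m x n

true_pivot = int(np.argmax(np.linalg.norm(A, axis=0)))
sample_pivot = int(np.argmax(np.linalg.norm(B, axis=0)))
print(true_pivot, sample_pivot)   # the sample identifies the same pivot column
```

Because norms are estimated from only `ell` rows, the estimates carry relative error on the order of 1/√ell, which is enough to rank columns whose norms are well separated.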


Cited by 73 publications (62 citation statements)
References 17 publications
“…Remark 3.2. Algorithm 1 can also be combined with the trQRCP algorithm [14]. Although trQRCP also generates a column permutation matrix, that permutation does not affect the Frobenius norm of any partial matrix of B or of the entire matrix B.…”
mentioning
confidence: 99%
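The invariance this remark relies on is elementary: right-multiplying by a permutation matrix only reorders columns, so the Frobenius norm of B, and of any row block of B, is unchanged. A quick numerical check (toy sizes, for illustration only):

```python
import numpy as np

# A column permutation Pi reorders the columns of B; since the Frobenius
# norm sums the squares of all entries, it is invariant under Pi.
rng = np.random.default_rng(1)
B = rng.standard_normal((6, 5))
Pi = np.eye(5)[:, rng.permutation(5)]   # column permutation matrix

print(np.linalg.norm(B @ Pi) - np.linalg.norm(B))          # ~0
print(np.linalg.norm(B[:3] @ Pi) - np.linalg.norm(B[:3]))  # ~0 for a row block too
```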
“…with an approximation quality closely related to the error term in R_22. The Randomized QRCP (RQRCP) algorithm [17,75] is a more efficient variant of Algorithm 4. RQRCP generates a Gaussian random matrix Ω with i.i.d. N(0, 1) entries, forms the initial sample matrix B = ΩA, and initializes Π = I_n.…”
Section: Preliminaries and Background
mentioning
confidence: 99%
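The initialization the excerpt describes can be carried through a full selection loop. Below is a hedged sketch of choosing k pivots from the sample matrix rather than from A itself — a simplified greedy Gram–Schmidt loop for illustration, not the blocked implementation of the paper (the function name and sizes are hypothetical):

```python
import numpy as np

def sample_pivots(A, k, ell, rng):
    """Select k column pivots using a Gaussian sample B = Omega @ A.

    Simplified illustration of the RQRCP idea: all pivot decisions are
    made from column norms of the small ell x n sample matrix.
    """
    m, n = A.shape
    Omega = rng.standard_normal((ell, m))
    B = Omega @ A                  # sample matrix, ell x n
    piv = np.arange(n)             # encodes the permutation Pi (starts as I_n)
    for j in range(k):
        # pick the remaining column with the largest sample norm
        p = j + int(np.argmax(np.linalg.norm(B[:, j:], axis=0)))
        B[:, [j, p]] = B[:, [p, j]]
        piv[[j, p]] = piv[[p, j]]
        # eliminate the chosen column from the trailing sample
        # (Gram-Schmidt step, the sample analogue of QRCP norm updates)
        q = B[:, j] / np.linalg.norm(B[:, j])
        B[:, j + 1:] -= np.outer(q, q @ B[:, j + 1:])
    return piv[:k]

rng = np.random.default_rng(0)
A = rng.standard_normal((300, 40))
A[:, 5] *= 50.0                    # plant an obviously dominant column
piv3 = sample_pivots(A, k=3, ell=12, rng=rng)
print(piv3)                        # the dominant column is chosen first
```

The Gram–Schmidt step plays the role of the trailing-norm downdate: after a pivot is committed, its contribution is removed from the sample so that subsequent norm comparisons reflect the residual matrix.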
“…The pivoted QLP decomposition is extensively analyzed by Huckaby and Chan [34,35]. More recently, Duersch and Gu developed a much more efficient variant of the pivoted QLP decomposition, TUXV, and empirically demonstrated its remarkable quality as a low-rank approximation, without a rigorous theoretical justification of TUXV's success [17].…”
mentioning
confidence: 99%
“…We assess the quality of low-rank approximation by reconstructing a gray-scale image of a differential gear of size 1280 × 804, taken from [55], using CoR-UTV, truncated QRCP, and the truncated SVD computed with the (widely recommended) PROPACK package [110]. The PROPACK function provides an efficient algorithm to compute a specified number of the largest singular values and corresponding singular vectors of a given matrix using Lanczos bidiagonalization with partial reorthogonalization, which makes it suitable for approximating large low-rank matrices.…”
Section: Image Reconstruction
mentioning
confidence: 99%
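The truncated-SVD reconstruction the excerpt describes can be sketched with `scipy.sparse.linalg.svds`, which, like PROPACK, computes only a specified number of the largest singular triplets via a Lanczos-type iteration. The matrix below is a synthetic stand-in with a rapidly decaying spectrum, not the gear image from the excerpt:

```python
import numpy as np
from scipy.sparse.linalg import svds

# Synthetic low-rank-ish matrix standing in for a gray-scale image.
rng = np.random.default_rng(0)
X = (rng.standard_normal((128, 80))
     @ np.diag(np.logspace(0, -6, 80))   # rapidly decaying spectrum
     @ rng.standard_normal((80, 80)))

k = 10
U, s, Vt = svds(X, k=k)        # partial SVD: only the k largest triplets
Xk = U @ np.diag(s) @ Vt       # rank-k reconstruction

rel_err = np.linalg.norm(X - Xk) / np.linalg.norm(X)
print(rel_err)                 # small: the spectrum decays fast past rank k
```

For an actual image, X would be the pixel array and the visual quality of Xk at various k is what the cited comparison evaluates.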