2011
DOI: 10.1137/100810447

New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property

Abstract: Consider an m×N matrix Φ with the Restricted Isometry Property of order k and level δ, that is, the norm of any k-sparse vector in R^N is preserved to within a multiplicative factor of 1±δ under application of Φ. We show that by randomizing the column signs of such a matrix Φ, the resulting map with high probability embeds any fixed set of p = O(e^k) points in R^N into R^m without distorting the norm of any point in the set by more than a factor of 1±4δ. Consequently, matrices with the Restricted Isometry Property…
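As a rough illustration of the construction described in the abstract, the sketch below uses plain NumPy with a Gaussian matrix standing in for an RIP matrix (Gaussian matrices satisfy the RIP with high probability for suitable m). The dimensions, the point set, and all names are illustrative assumptions, not code from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative dimensions: embed p points from R^N into R^m.
    m, N, p = 256, 4096, 100

    # Gaussian stand-in for an RIP matrix.
    Phi = rng.standard_normal((m, N)) / np.sqrt(m)

    # Randomize the column signs: apply Phi * D_eps, eps a vector of independent +-1 signs.
    eps = rng.choice([-1.0, 1.0], size=N)

    X = rng.standard_normal((N, p))      # a fixed set of p points in R^N (columns)
    Y = Phi @ (eps[:, None] * X)         # apply Phi * D_eps to every point at once

    ratios = np.linalg.norm(Y, axis=0)**2 / np.linalg.norm(X, axis=0)**2
    print("worst-case distortion of squared norms:", np.abs(ratios - 1).max())

With these (arbitrary) sizes the printed distortion should be small, consistent with the 1±4δ guarantee the abstract states for RIP matrices with randomized column signs.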

Cited by 238 publications (289 citation statements)
References 37 publications
“…Random matrices known to have this property include matrices with independent subgaussian entries (such as Gaussian or Bernoulli matrices), see for example [DG03]. Moreover, it is shown in [KW11] that any matrix that satisfies the classical RIP will satisfy the Johnson-Lindenstrauss lemma and thus the D-RIP with high probability after randomizing the signs of the columns. The latter construction allows for structured random matrices with fast multiplication properties such as randomly subsampled Fourier matrices (in combination with the results from [RV08]) and matrices representing subsampled random convolutions (in combination with the results from [RRT12, KMR14]); in both cases, however, again with randomized column signs.…”
Section: Theorem 2.2 ([CENR10])
mentioning
confidence: 99%
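For the structured case mentioned in the quote above, here is a minimal sketch of a randomly subsampled Fourier matrix with randomized column signs, applied through the FFT rather than as an explicit m×N matrix. The sizes and normalization are illustrative assumptions, not the constructions of [KW11] or [RV08].

    import numpy as np

    rng = np.random.default_rng(1)
    m, N = 256, 4096                              # illustrative sizes

    rows = rng.choice(N, size=m, replace=False)   # random row subsample of the DFT
    eps = rng.choice([-1.0, 1.0], size=N)         # randomized column signs

    def embed(x):
        # (subsampled unitary DFT) * D_eps applied in O(N log N) time via the FFT;
        # the sqrt(N/m) factor rescales so norms are preserved in expectation.
        xhat = np.fft.fft(eps * x) / np.sqrt(N)
        return np.sqrt(N / m) * xhat[rows]

    x = rng.standard_normal(N)
    print(np.linalg.norm(x), np.linalg.norm(embed(x)))   # the two norms should be close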
“…This can be interpreted as a dictionary version of the universality property discussed above. It has now been shown that several classes of random matrices satisfy this property as well as subsampled structured matrices, after applying random column signs [DG03,KW11]. The matrices in both of these cases are motivated by application scenarios, but typically in applications they appear without the randomized column signs.…”
mentioning
confidence: 99%
“…it is more efficient to use a fast random projection [1,9], which results in a projection cost of O(nd log d). Right protection involves multiplying each entry of the projection data matrix of size k × n with a scalar, with a cost of O(kn).…”
Section: Complexity
mentioning
confidence: 99%
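To make the cost comparison in this quote concrete, the sketch below projects n points in R^d with an FFT-based fast transform (roughly O(nd log d) work) and then scales every entry of the resulting k × n matrix (O(kn) work). The dimensions, the scalar, and the construction details are illustrative assumptions, not the cited papers' implementation.

    import numpy as np

    rng = np.random.default_rng(2)
    d, n, k = 1024, 500, 64                   # illustrative: n points in R^d, target dim k

    A = rng.standard_normal((d, n))           # data points as columns
    eps = rng.choice([-1.0, 1.0], size=d)     # random signs along the data dimension
    rows = rng.choice(d, size=k, replace=False)

    # Fast projection: sign flip + FFT along axis 0 (O(n d log d) total), then keep k rows.
    P = np.sqrt(d / k) * (np.fft.fft(eps[:, None] * A, axis=0) / np.sqrt(d))[rows, :]

    # Multiplying every entry of the k x n projected matrix by a scalar costs O(k n).
    alpha = 0.5                               # hypothetical scalar, purely illustrative
    P_scaled = alpha * P
    print(P.shape, P_scaled.shape)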
“…This notion, due to Candès and Tao [11], was intensively studied during the last decade and found various applications and connections to several areas of theoretical computer science, including sparse recovery [8,20,27], coding theory [14], norm embeddings [6,23], and computational complexity [4,31,25]. The original motivation for the restricted isometry property comes from the area of compressed sensing.…”
Section: Introduction
mentioning
confidence: 99%