2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS) 2017
DOI: 10.1109/focs.2017.64
Optimality of the Johnson-Lindenstrauss Lemma

Cited by 60 publications (46 citation statements); references 17 publications.
“…By [56,78], the vertices of the (n − 1)-simplex embed isometrically into any infinite dimensional Banach space, so we have thus justified the bound (16), and hence in particular the first lower bound on k α n ( 2 ) in (10). As we already explained, the second lower bound (for the almost-isometric regime) on k α n ( 2 ) in (10) is due to the very recent work [148]. The upper bound on k α n ( 2 ) in (10), namely that in (25) we can take…”
Section: Finite Subsets of Hilbert Space
confidence: 87%
See 1 more Smart Citation
“…By [56,78], the vertices of (n − 1)-simplex embed isometrically into any infinite dimensional Banach space, so we have thus justified the bound (16), and hence in particular the first lower bound on k α n ( 2 ) in (10). As we already explained, the second lower bound (for the almost-isometric regime) on k α n ( 2 ) in (10) is due to the very recent work [148]. The upper bound on k α n ( 2 ) in (10), namely that in (25) we can take…”
Section: Finite Subsets Of Hilbert Spacementioning
confidence: 87%
“…The present article is focused on embeddings that permit large errors, and in particular in ways to prove impossibility results even if large errors are allowed. For this reason, we will not describe here the ideas of the proof in [148] that pertains to the almost-isometric regime.…”
Section: 2.2
confidence: 99%
“…The MLP training finds a distance-preserving low-dimensional embedding function f specific to a given x t . The existence of such an embedding is guaranteed by the Johnson-Lindenstrauss lemma [13], [14], in a more general setting that does not restrict the embedding to being specific to a given vector.…”
Section: A Learning the Del Function With Multi-Layer Perceptron
confidence: 99%
“…The first one is to improve the sketching dimension. This, however, is known to be impossible for various regimes, see [12,63,65,72] and very recently for any embedding method by Larsen and Nelson [73]. The other direction is to make the sketching matrix sparse.…”
Section: Lemma 11 (Distributional Johnson-Lindenstrauss Lemma)
confidence: 99%
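The last excerpt above refers to the distributional Johnson-Lindenstrauss lemma: a random linear sketch into k = O(ε⁻² log n) dimensions preserves all pairwise distances among n points up to a factor of 1 ± ε with high probability. The following is a minimal illustrative sketch of that statement using a scaled Gaussian matrix; the constants and parameter names are assumptions for demonstration, not taken from the cited works.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, d, eps = 50, 1000, 0.25
# sketching dimension k = O(eps^-2 log n); the constant 8 is illustrative
k = int(np.ceil(8 * np.log(n) / eps**2))

X = rng.standard_normal((n, d))               # n points in R^d
S = rng.standard_normal((k, d)) / np.sqrt(k)  # scaled Gaussian sketching matrix
Y = X @ S.T                                   # embedded points in R^k

# measure distortion over all pairwise distances
ratios = [np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
          for i, j in combinations(range(n), 2)]
print(min(ratios), max(ratios))  # typically within [1 - eps, 1 + eps]
```

The "sparse sketching" direction mentioned in the excerpt replaces the dense Gaussian matrix S with a matrix having few nonzeros per column, which speeds up the matrix-vector product while keeping the same distortion guarantee.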