2020 International Conference on Signal Processing and Communications (SPCOM)
DOI: 10.1109/spcom50965.2020.9179594

Sample Complexity Lower Bounds for Compressive Sensing with Generative Models

Cited by 15 publications (54 citation statements)
References 18 publications
“…Surprisingly, the sharp sample complexity for the squares decoder (3)-(2) can be derived in this paper even if the measurements are highly quantized and corrupted by noise and sign flips. Very recently, under a generative prior, [33] and [42] derived sample complexity results for 1-bit CS. The sample complexity obtained in [33] is O(k log L) under the assumption that the generator G is L-Lipschitz continuous and the rows of A are i.i.d.…”
Section: Related Work (mentioning)
confidence: 99%
“…Very recently, under a generative prior, [33] and [42] derived sample complexity results for 1-bit CS. The sample complexity obtained in [33] is O(k log L) under the assumption that the generator G is L-Lipschitz continuous and the rows of A are i.i.d. sampled from N(0, I).…”
Section: Related Work (mentioning)
confidence: 99%
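
For orientation, the setting these citation statements refer to can be written compactly. This is a minimal sketch of a standard 1-bit compressive sensing model with a generative prior: the latent dimension k, Lipschitz constant L, generator G, and Gaussian rows of A come from the quoted text, while the noise term \xi_i and the omission of radius/accuracy factors in the bound are simplifying assumptions, not details taken from the cited works [33], [42].

\[
x^\star = G(z^\star), \qquad G:\mathbb{R}^k \to \mathbb{R}^n \ \text{is } L\text{-Lipschitz},
\]
\[
y_i = \operatorname{sign}\big(\langle a_i, x^\star \rangle + \xi_i\big), \qquad a_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, I_n), \quad i = 1, \dots, m,
\]
\[
m = O(k \log L) \ \text{measurements suffice for recovery in [33], up to radius and accuracy factors.}
\]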