Proceedings of the Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2014)
DOI: 10.1137/1.9781611973402.112

Model-based Sketching and Recovery with Expanders

Abstract: Linear sketching and recovery of sparse vectors with randomly constructed sparse matrices has numerous applications in several areas, including compressive sensing, data stream computing, graph sketching, and combinatorial group testing. This paper considers the same problem with the added twist that the sparse coefficients of the unknown vector exhibit further correlations as determined by a known sparsity model. We prove that exploiting model-based sparsity in recovery provably reduces the sketch size without […]
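As a rough illustration of the sketching setup in the abstract, the snippet below builds a sparse binary matrix with d ones per column (the adjacency matrix of a d-left-regular bipartite graph, the standard construction behind expander sketches) and forms the sketch y = Ax of a k-sparse vector. It assumes NumPy; the dimensions, per-column degree d, and sparsity k are arbitrary illustrative values, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, d, k = 1000, 200, 8, 10  # ambient dim, sketch size, ones per column, sparsity (illustrative)

# Sparse binary sketching matrix: each column has exactly d ones in random rows.
A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=d, replace=False), j] = 1.0

# k-sparse signal with random support and Gaussian nonzeros.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

# Linear sketch of x.
y = A @ x
print(y.shape)  # (m,)
```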

Cited by 10 publications (37 citation statements)
References 34 publications (130 reference statements)
“…We retain the complexity in n but obtain a smaller complexity in d, which is novel. Related results with a similar d were derived in [27,2], but for structured sparse signals in the framework of model-based compressive sensing or sketching. In that framework, one has second-order information about x beyond simple sparsity, which is first-order information about x.…”
Section: Introduction (supporting)
confidence: 56%
“…In Section 6, we show that an approximation-tolerant approach similar to AM-IHT succeeds even when the measurement matrix A is itself sparse. Our approach leverages the notion of the restricted isometry property in the ℓ1-norm, also called the RIP-1, which was first introduced in [23] and developed further in the model-based context by [27,24,25]. For sparse A, we propose a modification of AM-IHT, which we call AM-IHT with RIP-1.…”
Section: Paper Outline (mentioning)
confidence: 99%
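Expander-based iterative hard thresholding is commonly written as x^{t+1} = H_k(x^t + M(A, y − A x^t)), where M takes, for each coordinate, the median of the residual over that column's neighbourhood and H_k keeps the k largest-magnitude entries. The sketch below is a minimal, plain (non-model) version of that iteration, assuming a binary sparse A as in the snippet above; it is not the approximation-tolerant AM-IHT with RIP-1 variant discussed in the citation, and the function names are ours.

```python
import numpy as np

def median_operator(A, r):
    """For each coordinate i, take the median of the residual r over the rows
    where column i of the binary matrix A has a one (its neighbourhood in the
    underlying bipartite graph)."""
    u = np.zeros(A.shape[1])
    for i in range(A.shape[1]):
        u[i] = np.median(r[np.flatnonzero(A[:, i])])
    return u

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def eiht(A, y, k, iters=50):
    """Plain expander iterative hard thresholding (no structured-sparsity model)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + median_operator(A, y - A @ x), k)
    return x
```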
“…The paper [27] establishes both lower and upper bounds on the number of measurements required to satisfy the model RIP-1 for certain structured sparsity models. Assuming that the measurement matrix A satisfies the model RIP-1, the paper [25] proposes a modification of expander iterative hard thresholding (EIHT) [24], which achieves stable recovery for arbitrary structured sparsity models. As with the other algorithms for model-based compressive sensing, EIHT only works with exact model projection oracles.…”
Section: Prior Work (mentioning)
confidence: 99%
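In the model-based variants referenced above, the hard-thresholding step is replaced by an exact model projection oracle. As one deliberately simple example, projection onto K-block-sparse vectors (contiguous, non-overlapping blocks) keeps the K blocks of largest ℓ2 energy; the block length and K here are illustrative choices, not quantities from the cited papers.

```python
import numpy as np

def block_sparse_project(v, block_len, K):
    """Exact model projection onto K-block-sparse vectors: split v into
    contiguous non-overlapping blocks of length block_len and keep the K
    blocks with largest l2 energy (assumes len(v) % block_len == 0)."""
    blocks = v.reshape(-1, block_len)
    energy = np.sum(blocks ** 2, axis=1)
    keep = np.argsort(energy)[-K:]
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(-1)
```

Plugging such a projection in place of hard_threshold in the iteration sketched earlier gives a model-based variant of the same scheme.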
“…We call such an approximation G-group-sparse or, in short, group-sparse. The projection problem is a fundamental step in Model-based Iterative Hard-Thresholding algorithms for solving inverse problems by imposing group structures [7], [19]. More importantly, we seek to also identify the G-group-support of the approximation x̂, that is, the G groups that constitute its support.…”
Section: S ⊆ G, |S| ≤ G (mentioning)
confidence: 99%
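For non-overlapping groups, the G-group-sparse projection and the G-group-support can be computed by sorting group energies, as in the sketch below (group partition, function name, and the example values are illustrative; for overlapping or more general group structures the projection can be substantially harder).

```python
import numpy as np

def group_sparse_project(v, groups, G):
    """Project v onto vectors supported on at most G of the given groups
    (assumed non-overlapping index lists). Returns the approximation and the
    selected group indices, i.e. the G-group-support."""
    energy = np.array([np.sum(v[g] ** 2) for g in groups])
    selected = np.argsort(energy)[-G:]
    out = np.zeros_like(v)
    for s in selected:
        out[groups[s]] = v[groups[s]]
    return out, sorted(selected.tolist())

# Toy usage: pick the 2 most energetic groups out of 3.
v = np.array([3.0, 0.1, -2.0, 0.0, 5.0, 0.2])
x_hat, supp = group_sparse_project(v, [[0, 1], [2, 3], [4, 5]], G=2)
print(supp)  # [0, 2]
```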