2012
DOI: 10.1214/12-aos1057

Accuracy guaranties for $\ell_{1}$ recovery of block-sparse signals

Abstract: We introduce a general framework to handle structured models (sparse and block-sparse with possibly overlapping blocks). We discuss new methods for their recovery from incomplete observations, corrupted with deterministic and stochastic noise, using block-ℓ1 regularization. While the current theory provides promising bounds for the recovery errors under a number of different, yet mostly hard to verify conditions, our emphasis is on verifiable conditions on the problem parameters (sensing matrix and the block st…
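
To make the setting concrete, here is a minimal sketch of block-ℓ1 recovery via proximal gradient descent on synthetic data. It illustrates the general idea only, not the recovery routines analyzed in the paper; the dimensions, block structure, noise level, and penalty weight are assumptions made for the sketch.

```python
# Block-l1 (group-sparse) recovery via proximal gradient descent.
# All problem sizes and the penalty weight lam are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p, b = 80, 200, 10                               # observations, dimension, block size
blocks = [slice(i, i + b) for i in range(0, p, b)]  # non-overlapping blocks

A = rng.standard_normal((n, p)) / np.sqrt(n)        # sensing matrix
x_true = np.zeros(p)
x_true[:b] = rng.standard_normal(b)                 # one active block
y = A @ x_true + 0.01 * rng.standard_normal(n)      # noisy, incomplete observations

lam = 0.05                                          # block-l1 penalty weight (assumed)
L = np.linalg.norm(A, 2) ** 2                       # Lipschitz constant of the gradient
x = np.zeros(p)
for _ in range(500):
    z = x - (A.T @ (A @ x - y)) / L                 # gradient step on 0.5*||Ax - y||^2
    for sl in blocks:                               # prox step: block soft-thresholding
        nrm = np.linalg.norm(z[sl])
        z[sl] = 0.0 if nrm == 0 else max(0.0, 1 - lam / (L * nrm)) * z[sl]
    x = z

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```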

Cited by 12 publications (25 citation statements). References 27 publications.

Citation statements, ordered by relevance:

“…Much progress has been made over the last decade in high-dimensional statistics, where the number of unknown parameters greatly exceeds the sample size. The vast majority of work has been pursued for point estimation, such as consistency for prediction [7,21], oracle inequalities and estimation of a high-dimensional parameter [6,11,12,24,33,34,47,51], or variable selection [17,30,49,53]. Other references and an exposition of a broad class of models can be found in [18] or [10].…”

“…So under (A2), for suitable $\lambda \asymp \sqrt{\log(p)/n}$, $\|\hat{\beta} - \beta^{0}\|_{2} = O_{P}(\sqrt{s_{0}\log(p)/n})$ (24) (see also [6]). This result will be applied in the next subsection, albeit to the lasso for nodewise regression instead of for the original linear model.…”
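
As a toy illustration of the quoted tuning rule, the sketch below fits a lasso with a penalty of order $\sqrt{\log(p)/n}$; the data-generating model and the constant in front of the rate are arbitrary assumptions, and scikit-learn's Lasso stands in for the estimator in the quote.

```python
# Toy check of the tuning rule lambda ~ sqrt(log(p)/n) quoted above.
# The design, noise level, and constant are assumptions for this sketch.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p, s0 = 100, 500, 5                       # sample size, dimension, sparsity
X = rng.standard_normal((n, p))
beta0 = np.zeros(p)
beta0[:s0] = 1.0                             # s0 active coefficients
y = X @ beta0 + 0.5 * rng.standard_normal(n)

lam = np.sqrt(np.log(p) / n)                 # lambda of order sqrt(log(p)/n)
beta_hat = Lasso(alpha=lam).fit(X, y).coef_
print("l2 estimation error:", np.linalg.norm(beta_hat - beta0))
# The quoted bound says this error is O_P(sqrt(s0 * log(p) / n)).
```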

“…As we are focusing here on the general problem, which occurs also for numerous other forms of signals, we have opted to exclude the treatment of inharmonicity, although we note that the algorithm may be extended to allow for this along the lines presented in [31,32], or using a dictionary learning approach such as in [33,34]. The theoretical study of block-sparse signals was initially suggested in [35], where it is shown that including this structure in the estimation procedure has great practical consequences, improving both theoretical recovery limits and numerical results in many cases (see, e.g., [35][36][37][38]). Generally, this form of group-sparse convex optimization problem is computationally cumbersome; for this reason, we also derive an efficient algorithm to form the estimate based on the alternating direction method of multipliers (ADMM) (see, e.g., [39,40]).…”
Section: Introduction
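
For concreteness, here is a minimal ADMM sketch for a group-lasso objective of the kind the quote refers to. The penalty parameter rho, the block partition, and the fixed iteration count are illustrative assumptions, not the algorithm derived in the cited work.

```python
# ADMM for the group-lasso problem
#     min_x  0.5*||Ax - y||^2 + lam * sum_b ||x_b||_2
# rho, the blocks, and the stopping rule are assumptions for this sketch.
import numpy as np

def group_lasso_admm(A, y, blocks, lam=0.1, rho=1.0, iters=200):
    p = A.shape[1]
    x, z, u = np.zeros(p), np.zeros(p), np.zeros(p)
    # Factor A^T A + rho*I once; it is reused in every x-update.
    chol = np.linalg.cholesky(A.T @ A + rho * np.eye(p))
    Aty = A.T @ y
    for _ in range(iters):
        rhs = Aty + rho * (z - u)
        x = np.linalg.solve(chol.T, np.linalg.solve(chol, rhs))  # x-update
        for sl in blocks:                      # z-update: block soft-threshold
            v = x[sl] + u[sl]
            nrm = np.linalg.norm(v)
            z[sl] = 0.0 if nrm == 0 else max(0.0, 1 - lam / (rho * nrm)) * v
        u += x - z                             # scaled dual update
    return z

# Example: blocks of size 10 in a 200-dimensional problem.
rng = np.random.default_rng(2)
A = rng.standard_normal((80, 200))
y = A[:, :10] @ rng.standard_normal(10)
x_hat = group_lasso_admm(A, y, [slice(i, i + 10) for i in range(0, 200, 10)])
```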

“…We present the respective sparsity structures and provide verifiable sufficient conditions for the validity of the associated nullspace properties (and thus for the validity of the corresponding recovery routines); the prototypes of our verifiable conditions can be found in [5,6,7,8].…”
Section: Introduction
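
For a concrete, if much cruder, example of a verifiable condition in this spirit: the classical mutual-coherence test certifies exact ℓ1 recovery of every s-sparse signal whenever μ(A) < 1/(2s − 1). It is far weaker than the conditions developed in the paper, but it is checkable from the sensing matrix alone, which is the point the quote emphasizes.

```python
# Mutual-coherence test: a classical, directly checkable sufficient
# condition for l1 recovery, used here as a stand-in for the paper's
# more refined verifiable conditions.
import numpy as np

def certified_sparsity(A):
    """Largest s certified by the mutual coherence of A's columns."""
    An = A / np.linalg.norm(A, axis=0)       # unit-norm columns
    G = np.abs(An.T @ An)
    np.fill_diagonal(G, 0.0)
    mu = G.max()                             # mutual coherence mu(A)
    # Largest integer s with mu < 1/(2s - 1), i.e. s < (1 + 1/mu)/2.
    return mu, int(np.ceil((1.0 + 1.0 / mu) / 2.0)) - 1

rng = np.random.default_rng(3)
mu, s = certified_sparsity(rng.standard_normal((100, 200)))
print(f"mu = {mu:.3f}: l1 recovery certified for all {s}-sparse signals")
```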