Sparse Optimization Theory and Methods (2018)
DOI: 10.1201/9781315113142

Cited by 43 publications (84 citation statements). References 0 publications.
“…Motivated by the new analysis tool introduced in [29], we develop the stability result for the model (2) in this paper under the assumption of the restricted weak range space property (RSP) of order k (which will be introduced in the next section). Our result extends the stability theorem for ℓ1-minimization established by Zhao et al. [28][29][30].…”
Section: Introduction (supporting, confidence: 88%)
“…The problem (C1) is often called the standard ℓ0-minimization problem [8,17,28]. Two structured sparsity models, called the nonnegative sparsity model [7,8,17,28] and the monotonic sparsity model (isotonic regression) [23,24], are also special cases of the model (1). It is well known that ℓ1-minimization is a useful method for solving the ℓ0-minimization problem.…”
Section: Introduction (mentioning, confidence: 99%)
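For context, a minimal LaTeX sketch of the formulations these statements refer to, assuming the conventional compressed-sensing notation (A the measurement matrix, b the observed vector); the precise constraints in the citing papers' models (1), (2), and (C1) may differ:

\[
\min_{x \in \mathbb{R}^n} \|x\|_0 \ \ \text{s.t.}\ \ Ax = b \qquad (\ell_0\text{-minimization})
\]
\[
\min_{x \in \mathbb{R}^n} \|x\|_1 \ \ \text{s.t.}\ \ Ax = b \qquad (\ell_1\ \text{convex relaxation})
\]

The ℓ1 problem is the convex surrogate referred to above as "a useful method to solve the ℓ0-minimization problem."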
“…Both (1) and (2) are central models for sparse signal recovery and for the sparse representation of data over redundant bases. These models provide an essential basis for the development of the theory and algorithms of compressed sensing (see, e.g., [12,25,26,32,53]). The problem (1) has also been widely used in statistical regression and wireless communications (see, e.g., [45,4,41]).…”
Section: Introduction (mentioning, confidence: 99%)
“…The problems (1) and (2) are NP-hard in general [46]. The plausible algorithms for such problems can be briefly categorized into the following classes: (i) convex optimization methods (e.g., ℓ1-minimization [19], reweighted ℓ1-minimization [16,31,56], and dual-density-based reweighted ℓ1-minimization [53,54,55]); (ii) heuristic methods (such as matching pursuit [43], orthogonal matching pursuit [44,52], compressive sampling matching pursuit [47], and subspace pursuit [20]); (iii) thresholding methods (e.g., soft thresholding [21,22,24], hard thresholding [6,7,8,30], graded hard thresholding pursuits [10,11], and 'firm' thresholding [51]); (iv) integer programming methods [4].…”
Section: Introduction (mentioning, confidence: 99%)
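To make the thresholding class (iii) concrete, here is a minimal Python sketch of iterative soft thresholding (ISTA) applied to the ℓ1-regularized least-squares surrogate. The regularization weight, iteration count, and toy problem sizes are illustrative assumptions, not values taken from the cited works.

import numpy as np

def soft_threshold(z, tau):
    # Componentwise soft-thresholding operator: shrink each entry toward zero by tau.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista(A, b, lam=0.1, n_iter=500):
    # Iterative soft thresholding for min_x 0.5*||Ax - b||_2^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 bounds the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)              # gradient of the smooth least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy usage: recover a 3-sparse vector from 40 random measurements of a length-100 signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [1.5, -2.0, 0.8]
b = A @ x_true
x_hat = ista(A, b, lam=0.05)

The greedy methods in class (ii), such as orthogonal matching pursuit, instead build the support one index at a time; the sketch above is only meant to show the shrinkage step that defines the thresholding family.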