2020
DOI: 10.1080/10556788.2020.1734003

Stability analysis of a class of sparse optimization problems

Abstract: Sparse optimization problems arise in many areas of science and engineering, such as compressed sensing, image processing, and statistical and machine learning. The ℓ0-minimization problem is one such problem, typically used for signal recovery. The ℓ1-minimization method is one of the plausible approaches for solving the ℓ0-minimization problem, and thus the stability of this numerical method is vital for signal recovery. In this paper, we establish a stability result for…
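For reference (not part of the article's abstract), the standard formulations usually meant by these terms, given here only as an illustrative sketch, are

\[
(\ell_0)\quad \min_{x}\ \|x\|_0 \ \ \text{s.t.}\ \ Ax = y,
\qquad
(\ell_1)\quad \min_{x}\ \|x\|_1 \ \ \text{s.t.}\ \ Ax = y,
\]

where A is the measurement matrix, y is the observed measurement vector, and ‖x‖_0 counts the nonzero entries of x. The convex ℓ1 problem (basis pursuit) serves as a tractable surrogate for the combinatorial ℓ0 problem; the exact models and assumptions analyzed in the paper may differ from this sketch.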

Cited by 5 publications (5 citation statements)
References 33 publications
“…Although the experiments have shown that DRA-typed algorithms outperform ℓ1-minimization and some classic reweighted ℓ1-algorithms, there still exists some future work to do. For example, the convergence and the stability of DRA-typed algorithms are worthwhile future work, which might be investigated under certain assumptions such as the so-called restricted weak range space property (see, e.g., [34]).…”
Section: Discussion (citation type: mentioning, confidence: 99%)
“…In this case, the difficulty in solving problems (33) and (34) is that ε(λ_6) might attain an infinite value when w_i → ∞. We may introduce a bounded merit function ε ∈ F into (33) and (34) so that the value of ε(λ_6) is finite. Moreover, to avoid an infinite optimal value in the model (33), w ∈ ζ can be relaxed to −λ_1 − λ_2^T b + λ_3^T y ≤ 1 due to weak duality.…”
Section: One-step Dual-density-based Algorithm (citation type: mentioning, confidence: 99%)
“…The stability of recovery means that recovery errors stay under control even if the measurements are slightly inaccurate and the data are not exactly sparse. Recent stability studies for CS can be found in [21][22][23][24][25]. However, few theoretical results are available on the stability of 1-bit CS.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
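For context (not part of the quoted statement), a standard form of such a stability estimate, sketched here under the usual restricted isometry assumptions on the measurement matrix A, bounds the error of the ℓ1-minimization solution x̂ recovered from noisy measurements y = Ax + e with ‖e‖_2 ≤ η:

\[
\|\hat{x} - x\|_2 \ \le\ C_1\,\frac{\sigma_k(x)_1}{\sqrt{k}} \ +\ C_2\,\eta,
\]

where σ_k(x)_1 is the ℓ1-error of the best k-term approximation of x and C_1, C_2 are constants depending only on A. This generic bound illustrates what "errors stay under control" means; the precise constants and conditions in [21]–[25] and in the cited paper may differ.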
“…)*, where T_α^* and (T_α^{P_0})^* are the solutions of (18) and (33) (cf. (24) and (34)), and T_α^{P_0} is given as (25) with P ≔ P_0.…”
(citation type: mentioning, confidence: 99%)