2021
DOI: 10.48550/arxiv.2111.08244
Preprint

Sparse Regularization with the $\ell_0$ Norm

Abstract: We consider a minimization problem whose objective function is the sum of a fidelity term, not necessarily convex, and a regularization term defined by a positive regularization parameter λ multiple of the $\ell_0$ norm composed with a linear transform. This problem has wide applications in compressed sensing, sparse machine learning and image reconstruction. The goal of this paper is to understand what choices of the regularization parameter can dictate the level of sparsity under the transform for a global minimizer…
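For reference, the model described in the abstract can be written as the following minimization problem. This is a sketch using our own symbols, not notation taken from the paper: $\psi$ denotes the fidelity term and $B$ the linear transform.

$$
\min_{x \in \mathbb{R}^n} \; \psi(x) + \lambda \, \|Bx\|_0,
$$

where $\psi$ is not necessarily convex, $\lambda > 0$ is the regularization parameter, and $\|\cdot\|_0$ counts the number of nonzero entries of its argument. The paper studies how the choice of $\lambda$ controls the sparsity level of $Bx^\star$ at a global minimizer $x^\star$.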

Cited by 1 publication (1 citation statement)
References: 28 publications
“…Sparse models can improve accuracy of approximation. Sparse regularization is a popular way to learn the sparse solutions [5,38,39,42]. The readers are referred to [24] for an overview of sparse deep learning.…”
Section: Introduction
confidence: 99%