2019
DOI: 10.1155/2019/1978154
A New Smoothed L0 Regularization Approach for Sparse Signal Recovery

Abstract: Sparse signal reconstruction, as the core step of compressive sensing (CS) theory, has attracted extensive attention in recent years. The essence of sparse signal reconstruction is to recover the original signal accurately and effectively from an underdetermined linear system of equations (ULSE). For this problem, we propose a new algorithm, the regularization reweighted smoothed L0 norm minimization algorithm, referred to as the RRSL0 algorithm. Three innovations are made under the framework of this met…

Cited by 8 publications (10 citation statements) · References 36 publications

“…Different approaches have been proposed to find an approximate solution of the ℓ0-norm minimization problem min ‖x‖₀ s.t. ‖y − Ax‖₂ ≤ ε, or of the (similar) sparsity-constrained problem min ‖y − Ax‖₂ s.t. ‖x‖₀ ≤ k, or, in the noiseless case, min ‖x‖₀ s.t. y = Ax, all of which are NP-hard. Among others, one strategy is to approximate the ℓ0 norm by a differentiable version [29, 30, 31, 32]. Another possibility, widely explored, is to use a greedy strategy in the spirit of matching pursuit [33], initially proposed in the underdetermined case (sparse coding), which evolved into orthogonal matching pursuit (OMP) and orthogonal least squares (OLS).…”
Section: Methods
confidence: 99%
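
To make the first strategy in the quote concrete, the sketch below implements a classical smoothed-ℓ0 iteration in the spirit of the works cited there (it is not the RRSL0 algorithm of the paper above): the ℓ0 "norm" is replaced by the differentiable Gaussian surrogate Σᵢ exp(−xᵢ²/(2σ²)), optimized over a decreasing sequence of σ while repeatedly projecting back onto the constraint Ax = y. The function name, step size, and toy dimensions are illustrative assumptions.

```python
import numpy as np

def sl0(A, y, sigma_min=1e-3, sigma_decrease=0.5, mu=2.0, inner_iters=3):
    """Minimal smoothed-L0 sketch: replace ||x||_0 by
    n - sum_i exp(-x_i^2 / (2 sigma^2)) and optimize it over a
    decreasing sequence of sigma, staying on the set {x : A x = y}."""
    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ y                      # minimum-L2-norm feasible start
    sigma = 2.0 * np.max(np.abs(x))
    while sigma > sigma_min:
        for _ in range(inner_iters):
            delta = x * np.exp(-x**2 / (2.0 * sigma**2))  # ascent direction
            x = x - mu * delta
            x = x - A_pinv @ (A @ x - y)                  # project onto A x = y
        sigma *= sigma_decrease
    return x

# toy example: recover a 5-sparse vector from 40 random measurements
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = sl0(A, A @ x_true)
```
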
“…Since it is NP-hard to solve directly, some previous works circumvent it by relaxing ℓ0 to ℓ1, which is the convex function closest to ℓ0. Inspired by the research field of smoothed-ℓ0 optimization, we use the following smoothed-ℓ0 function [34] as our gate for LPFS:…”
Section: LPFS
confidence: 99%
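
The quoted passage is cut off before the gate function itself, so the snippet below only illustrates one common smoothed-ℓ0 surrogate, gσ(α) = 1 − exp(−α²/(2σ²)), which approaches the indicator of α ≠ 0 as σ → 0; the exact function used in [34] is not reproduced here, and the name smoothed_l0_gate is hypothetical.

```python
import numpy as np

def smoothed_l0_gate(alpha, sigma=0.1):
    """Illustrative smoothed-L0 surrogate: differentiable, close to 0 for
    alpha near 0 and close to 1 once |alpha| is a few sigmas away from 0."""
    return 1.0 - np.exp(-alpha**2 / (2.0 * sigma**2))

alpha = np.array([0.0, 0.01, 0.1, 1.0])
print(smoothed_l0_gate(alpha))  # ~[0.00, 0.005, 0.39, 1.00]
```
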
“…Inspired by the study of smoothed-ℓ0 [7, 23, 24, 34] in compressed sensing, we consider feature selection as an optimization problem under an ℓ0-norm constraint and propose a novel Learnable Polarizing Feature Selection (LPFS) method to effectively select highly informative features. We insert such differentiable function-based gates between the embedding layer and the input layer of the network to control the active and inactive states of these features.…”
Section: Introduction
confidence: 99%
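
A minimal sketch of the gating idea described above, under the assumption that the gate takes the Gaussian smoothed-ℓ0 form from the previous snippet: each feature embedding is scaled by its gate value before being fed to the input layer, so features whose gate parameters are driven toward zero are effectively switched off. This is an illustration, not the LPFS implementation.

```python
import numpy as np

def gated_input(embeddings, gate_params, sigma=0.1):
    """embeddings:  (num_features, embed_dim) per-feature embeddings.
    gate_params: (num_features,) learnable scalars, one per feature.
    Each embedding is scaled by a smoothed-L0 gate in [0, 1) before it is
    flattened and fed to the input layer of the network."""
    gates = 1.0 - np.exp(-gate_params**2 / (2.0 * sigma**2))
    return (embeddings * gates[:, None]).reshape(-1)

emb = np.random.default_rng(1).standard_normal((4, 8))
theta = np.array([0.0, 0.02, 0.5, 2.0])   # gates near 0 switch features off
x_in = gated_input(emb, theta)            # flattened input vector for the MLP
```
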
“…of metrics like Bit Error Rate (BER) and Mean Square Error (MSE) is much better than the LS channel estimation technique [6][7][8][9][10]. Based on the premise that the transmission channel is sparse, with only a few significant channel coefficients, compressive sensing-based channel estimation algorithms are also gaining popularity [11][12][13][14][15][16][17][18]. According to compressive sensing theory, if a signal is sparse in a known basis, then far fewer measurements may be needed to represent it in compressed form [19][20][21].…”
Section: Introduction
confidence: 99%
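
As a concrete illustration of that claim, the sketch below builds a length-128 channel with only four nonzero taps, observes it through 32 random measurements, and recovers it with orthogonal matching pursuit (one of the greedy methods mentioned earlier). The dimensions, measurement matrix, and the omp helper are illustrative assumptions, not taken from the cited channel-estimation papers.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A, then
    least-squares fit y on the selected support."""
    n = A.shape[1]
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(2)
L, m, taps = 128, 32, 4                          # channel length, measurements, taps
h = np.zeros(L)
h[rng.choice(L, taps, replace=False)] = rng.standard_normal(taps)
Phi = rng.standard_normal((m, L)) / np.sqrt(m)   # random measurement matrix
y = Phi @ h                                      # m << L compressed observations
h_hat = omp(Phi, y, taps)
print(np.linalg.norm(h - h_hat))                 # small reconstruction error
```
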