2010
DOI: 10.1016/j.sigpro.2010.05.015
Convergence analysis of sparse LMS algorithms with l1-norm penalty based on white input signal

Cited by 139 publications (104 citation statements); references 6 publications.
“…Several sparsity-aware modifications of the standard LMS have been introduced in the literature [7,8,9,10,11,12,13,24].…”
Section: Standard LMS
confidence: 99%
“…A penalty in the form of the l0-pseudo-norm of the CIR is used in [8], while [7] uses the l1-norm. In [9], the mean-square convergence and stability analysis of one of the algorithms in [7] is presented for the case of white input signals. A performance analysis of the l0-pseudo-norm constraint LMS algorithm of [8] is given in [10].…”
Section: Introduction
confidence: 99%
“…The first sparse LMS algorithm motivated by the CS technique introduces an l1-norm constraint term into the basic LMS to exploit the inherently sparse character of the broadband multi-path channel [24][25][26][27]. As a result, a zero attractor appears in the update equation of the sparse LMS algorithm, yielding the zero-attracting (ZA) LMS (ZA-LMS) algorithm.…”
Section: Introduction
confidence: 99%
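The zero attractor described above turns the l1-norm penalty into a sign(w) term in the LMS recursion, w(n+1) = w(n) + mu·e(n)·x(n) − rho·sign(w(n)). A minimal NumPy sketch of that update for sparse channel identification with white input, as in the analyzed setting (function name, parameter values, and the test channel are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def za_lms_identify(x, d, num_taps, mu=0.01, rho=1e-4):
    """ZA-LMS sketch: standard LMS step plus an l1-norm-induced
    zero attractor, -rho * sign(w), that pulls inactive taps to zero."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent input vector
        e = d[n] - w @ u                      # a priori estimation error
        w += mu * e * u - rho * np.sign(w)    # LMS update + zero attraction
    return w

# usage sketch: identify a sparse 16-tap channel from white Gaussian input
rng = np.random.default_rng(0)
h = np.zeros(16); h[3] = 1.0                  # hypothetical sparse channel
x = rng.standard_normal(4000)                 # white input, as in the analysis
d = np.convolve(x, h)[:len(x)]                # noiseless channel output
w_hat = za_lms_identify(x, d, 16)
```

Note the trade-off the penalty introduces: the attractor speeds convergence of the near-zero taps but biases the active tap by roughly rho/mu at steady state.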
“…As a result, a zero attractor appears in the update equation of the sparse LMS algorithm, yielding the zero-attracting (ZA) LMS (ZA-LMS) algorithm. Furthermore, a reweighted ZA-LMS (RZA-LMS) was reported, which replaces the l1-norm penalty of ZA-LMS with a sum-log constraint [24,27]. Subsequently, zero-attracting techniques have been widely researched, and many sparse LMS algorithms have been developed using different norm constraints, such as the lp-norm and smoothed l0-norm constraints [28][29][30][31][32].…”
Section: Introduction
confidence: 99%
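The sum-log constraint mentioned above reweights the zero attractor by 1/(1 + eps·|w|), so small taps are attracted toward zero much more strongly than large ones, reducing the bias on active taps. A hedged NumPy sketch of the RZA-LMS recursion under the same white-input identification setup (names and constants are illustrative assumptions):

```python
import numpy as np

def rza_lms_identify(x, d, num_taps, mu=0.01, rho=1e-4, eps=10.0):
    """RZA-LMS sketch: the sum-log penalty yields a reweighted attractor,
    -rho * sign(w) / (1 + eps*|w|), selective toward small-magnitude taps."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent input vector
        e = d[n] - w @ u                      # a priori estimation error
        # reweighted zero attraction: large taps feel almost no shrinkage
        w += mu * e * u - rho * np.sign(w) / (1.0 + eps * np.abs(w))
    return w

# usage sketch, mirroring the ZA-LMS example
rng = np.random.default_rng(0)
h = np.zeros(16); h[3] = 1.0                  # hypothetical sparse channel
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)]
w_hat = rza_lms_identify(x, d, 16)
```

For an active tap of magnitude 1, the shrinkage is attenuated by the factor 1/(1 + eps), which is why RZA-LMS typically shows lower steady-state bias than ZA-LMS at comparable sparsity enforcement.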