Sparsity regularised recursive least squares adaptive filtering
2011
DOI: 10.1049/iet-spr.2010.0083


Cited by 56 publications (35 citation statements)
References 11 publications
“…We name the resulting algorithms ℓ1-RTLS and ℓ0-RTLS when the ℓ1 norm and the approximate ℓ0 norm penalty functions are employed, respectively. Similar to the sparsity regularized RLS algorithms [3,4], our algorithms do not require heavy matrix calculations, and they have O(L²) operational complexity per time instant. Furthermore, it can be deduced that our algorithms rely on a first-order approximation, and hence they are… [Algorithm 1: Sparsity regularized RTLS procedure]”
Section: Sparsity Regularized RTLS
confidence: 97%
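The excerpt above attributes O(L²) per-sample complexity to the sparsity regularized RLS family. Below is a minimal sketch of what such an update can look like, assuming an ℓ1 (sign-subgradient) zero attractor added to the standard RLS recursion; the function name `l1_rls_update` and the parameter values are illustrative and not taken from the cited papers.

```python
import numpy as np

def l1_rls_update(w, P, x, d, lam=0.999, gamma=5e-4):
    """One l1-regularized RLS step (illustrative sketch, not the cited
    papers' exact recursion). w: weights (L,), P: inverse correlation
    matrix (L, L), x: input vector (L,), d: desired sample. Cost is
    O(L^2) per step because only matrix-vector products appear."""
    Px = P @ x                        # O(L^2) matrix-vector product
    k = Px / (lam + x @ Px)           # RLS gain vector
    e = d - w @ x                     # a priori estimation error
    w = w + k * e                     # standard RLS correction
    w = w - gamma * np.sign(w)        # l1 subgradient: zero attraction
    P = (P - np.outer(k, Px)) / lam   # rank-1 inverse update, O(L^2)
    return w, P
```

The zero-attraction step costs only O(L), so the overall O(L²) complexity of RLS is preserved, consistent with the claim in the excerpt.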
“…In [2], the authors introduced the SPARLS algorithm, which relies on ℓ1 norm regularization and expectation-maximization (EM). RLS algorithms regularized with a weighted ℓ1 norm and an approximate ℓ0 norm were presented in [3] and [4], respectively. A greedy sparse RLS algorithm based on optimized orthogonal matching pursuit (OMP) was suggested in [5].…”
Section: Introduction
confidence: 99%
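For the approximate ℓ0 norm penalty mentioned in [4], one commonly used smooth surrogate is the sum of terms 1 − exp(−β|wᵢ|); the sketch below assumes that particular form, which the cited paper may define differently.

```python
import numpy as np

def approx_l0_penalty(w, beta=5.0):
    """Smooth surrogate for the l0 norm: sum_i (1 - exp(-beta * |w_i|)).
    As beta grows, each term approaches 1 for w_i != 0 and 0 for
    w_i = 0, so the sum approaches the count of nonzero coefficients."""
    return np.sum(1.0 - np.exp(-beta * np.abs(w)))

def approx_l0_gradient(w, beta=5.0):
    """Gradient of the surrogate. It attracts small coefficients toward
    zero far more strongly than large ones, unlike the l1 subgradient,
    which pulls on all coefficients equally."""
    return beta * np.sign(w) * np.exp(-beta * np.abs(w))
```

This selective shrinkage is what makes approximate ℓ0 regularization attractive for sparse system identification: large, active taps are left nearly untouched.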
“…In this case, the proposed algorithm is called CIMRGMCC. Remark: From Table , we know that ZARGMCC and CIMRGMCC reduce to ZARLS and CIMRLS when f(e(n)) = 1 and p = 2, while they reduce to RMCC when ρ = 0 and p = 2.…”
Section: Sparse Recursive Generalized Maximum Correntropy Criterion A…
confidence: 99%
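The reductions noted in this excerpt follow from the error-weighting function f(e(n)) in the generalized maximum correntropy cost. The sketch below assumes a generalized Gaussian kernel exp(−|e/σ|^p) as the weight, which is one common form; the cited paper's exact definition and normalization may differ.

```python
import numpy as np

def gmcc_weight(e, p=2.0, sigma=1.0):
    """Error weight induced by a generalized Gaussian kernel
    exp(-|e/sigma|**p) (assumed form, not necessarily the cited
    paper's). Setting the weight to the constant 1 with p = 2
    recovers an ordinary squared-error cost, which is why the
    sparse GMCC recursions collapse to their RLS counterparts
    (ZARGMCC -> ZARLS, CIMRGMCC -> CIMRLS) in that case."""
    return np.exp(-np.abs(e / sigma) ** p)
```

Likewise, dropping the sparsity penalty (ρ = 0) while keeping p = 2 leaves only the correntropy-weighted quadratic cost, giving the RMCC reduction stated above.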
“…Simulation results confirm the efficiency of the proposed algorithm. Note that in the literature, sparsity-inducing penalty terms have been successfully used in linear adaptive filtering algorithms such as least mean square (LMS) and recursive least squares (RLS) [11]–[17]. These sparsity regularized algorithms have been shown to be highly efficient for sparse signal estimation and system identification.…”
Section: Introduction
confidence: 99%
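As a concrete instance of the sparsity-inducing penalties referenced for LMS-type filters, a zero-attracting LMS update is sketched below; it is representative of the family cited in [11]–[17] rather than a reproduction of any one algorithm, and the step-size values shown are illustrative.

```python
import numpy as np

def za_lms_update(w, x, d, mu=0.01, rho=5e-4):
    """One zero-attracting LMS step (representative sketch). w: filter
    weights (L,), x: input regressor (L,), d: desired sample. The
    rho * sign(w) term is the l1 subgradient that shrinks inactive
    taps toward zero, speeding convergence on sparse systems."""
    e = d - w @ x                # instantaneous estimation error
    w = w + mu * e * x           # standard LMS gradient step
    w = w - rho * np.sign(w)     # zero attractor from the l1 penalty
    return w, e
```

Compared with plain LMS, the extra O(L) attraction step changes neither the per-sample complexity nor the update structure, which is why such penalties transfer readily to RLS and its robust variants.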