2013
DOI: 10.1002/acs.2428
Sparse leaky‐LMS algorithm for system identification and its convergence analysis

Abstract: In this paper, a novel adaptive filter for sparse systems is proposed. The proposed algorithm incorporates a log-sum penalty into the cost function of the standard leaky least mean square (LMS) algorithm, which results in a shrinkage term in the update equation. This shrinkage, in turn, enhances the performance of the adaptive filter, especially when the majority of the unknown system coefficients are zero. A convergence analysis of the proposed algorithm is presented, and a stability criterion for the algorithm …
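The update described in the abstract can be sketched as a leaky LMS step plus a log-sum-induced shrinkage (zero-attracting) term. This is a minimal illustration, not the paper's exact algorithm; the step size `mu`, leak factor `gamma`, shrinkage strength `rho`, and reweighting parameter `eps` are illustrative names, and their values are assumptions:

```python
import numpy as np

def sparse_leaky_lms(x, d, num_taps, mu=0.02, gamma=0.001, rho=5e-4, eps=10.0):
    """Hedged sketch of a leaky LMS with a log-sum (reweighted
    zero-attracting) shrinkage term; parameter names are illustrative."""
    w = np.zeros(num_taps)          # filter coefficient estimate
    e = np.zeros(len(x))            # a priori error sequence
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # tapped-delay-line regressor
        e[n] = d[n] - w @ u                   # a priori estimation error
        # leaky LMS step plus the log-sum zero attractor, which shrinks
        # small coefficients strongly and large ones only weakly
        w = (1 - mu * gamma) * w + mu * e[n] * u \
            - rho * np.sign(w) / (1 + eps * np.abs(w))
    return w, e
```

On a sparse unknown system (most true coefficients zero), the attractor drives the inactive taps toward zero while leaving the few active taps nearly unbiased, which is the performance gain the abstract refers to.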

Cited by 27 publications (21 citation statements).
References 13 publications (9 reference statements).
“…In the third experiment, we demonstrate the performance when the input signal is a fragment of 2 s of real speech, sampled at 8 kHz [4,8]. Figure 9 shows an acoustic echo path of a 1024-tap system with 52 non-zero coefficients, which can be considered very sparse and is used in the simulation.…”
Section: Methods
confidence: 99%
“…In general, a sparse adaptive filtering algorithm can be derived by incorporating a sparsity penalty term (SPT), such as the l0-norm, into a traditional adaptive algorithm. Typical examples of sparse adaptive filtering algorithms include the sparse least mean square (LMS) [1][2][3][4], sparse affine projection algorithm (APA) [5], sparse recursive least squares (RLS) [6], and their variations [7][8][9][10][11][12].…”
Section: Introduction
confidence: 99%
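As a concrete instance of the recipe above, adding an l1 penalty (a common convex surrogate for the l0-norm) to the LMS cost yields the zero-attracting LMS, whose subgradient contributes a constant pull toward zero. The sketch below is an assumption-laden illustration of this family, not any cited paper's exact algorithm; `mu` and `rho` are illustrative parameter names:

```python
import numpy as np

def za_lms(x, d, num_taps, mu=0.02, rho=5e-4):
    """Hedged sketch: zero-attracting LMS from an l1-penalized LMS cost,
    J(n) = e(n)^2 / 2 + (rho/mu) * ||w||_1; names are illustrative."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # tapped-delay-line regressor
        e[n] = d[n] - w @ u
        # standard LMS step plus the zero attractor -rho*sign(w),
        # i.e. the subgradient of the l1 penalty scaled by the step size
        w = w + mu * e[n] * u - rho * np.sign(w)
    return w, e
```

Note the uniform attractor pulls on every coefficient equally, which is exactly the drawback of plain ZA methods discussed in the literature: large (active) taps are biased by the same amount as the zero taps, motivating reweighted and proportionate variants.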
“…Finally, we set up an experiment to study the tracking behavior of our GZA-PNMCC algorithm for estimating a long-tap echo channel with two different sparsity levels and a length of 256. The sparsity measure of the echo channel is ζ12 [44][45][46][47][48]. A typical echo channel is depicted in Figure 6.…”
Section: Behavior of the Proposed GZA-PNMCC Algorithm
confidence: 99%
“…The ZA algorithms cannot distinguish zero taps from nonzero taps, since they exert zero attraction on all the filter coefficients and hence may degrade performance. Subsequently, zero-attracting techniques have been introduced into proportionate adaptive filter algorithms [5,16,18], the leaky least mean square algorithm [19], and normalized least mean square algorithms [7] to form the desired zero attractors [6,11,17]. Their convergence characteristics are analyzed in [10,22].…”
Section: Introduction
confidence: 99%