2009
DOI: 10.1007/s11265-009-0385-9
Convergence Behavior of NLMS Algorithm for Gaussian Inputs: Solutions Using Generalized Abelian Integral Functions and Step Size Selection

Abstract: This paper studies the mean and mean square convergence behaviors of the normalized least mean square (NLMS) algorithm with Gaussian inputs and additive white Gaussian noise. Using Price's theorem and the framework proposed by Bershad in IEEE Transactions on Acoustics, Speech, and Signal Processing (1986, 1987), new expressions for the excess mean square error, stability bound, and decoupled difference equations describing the mean and mean square convergence behaviors of the NLMS algorithm using the genera…
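For context on the algorithm being analyzed, the following is a minimal NumPy sketch of the standard NLMS update in a system-identification setting matching the abstract's model (Gaussian input, additive white Gaussian noise). The step size mu, the regularizer delta, and the unknown system h are illustrative assumptions, not values from the paper; the paper's actual contribution is the exact mean / mean-square analysis and step size selection, which is not reproduced here.

```python
import numpy as np

def nlms_identify(x, d, num_taps, mu=0.5, delta=1e-6):
    """Identify an FIR system with the NLMS algorithm.

    x     : input signal (Gaussian in the paper's model)
    d     : desired signal = unknown system output + additive white Gaussian noise
    mu    : normalized step size (illustrative value; the paper derives the
            stability bound and guidance for choosing it)
    delta : small regularizer to avoid division by zero
    """
    w = np.zeros(num_taps)                      # adaptive filter weights
    e = np.zeros(len(x))                        # a-priori error sequence
    for n in range(num_taps, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]     # current tap-delay input vector
        y = w @ u                               # filter output
        e[n] = d[n] - y                         # a-priori estimation error
        w += mu * e[n] * u / (delta + u @ u)    # normalized LMS update
    return w, e

# Illustrative use: white Gaussian input, hypothetical unknown FIR system, AWGN.
rng = np.random.default_rng(0)
N, taps = 20000, 8
h = rng.standard_normal(taps)                  # hypothetical unknown system
x = rng.standard_normal(N)                     # Gaussian input, as assumed in the analysis
d = np.convolve(x, h)[:N] + 0.01 * rng.standard_normal(N)
w_hat, e = nlms_identify(x, d, taps, mu=0.5)
```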

Cited by 16 publications (15 citation statements).
References 19 publications (63 reference statements).
“…Therefore, under the stated assumptions, the maximum convergence rate of the normalized algorithms using ATS is faster than the LMS-based algorithms if the eigenvalues are unequal. Similar conclusion is obtained for the conventional NLMS algorithm [37,46].…”
Section: Remarks (supporting)
Confidence: 84%
“…(43) and (45) will reduce to the EMSE(∞) and stability bound for the conventional NLMS algorithm derived in [37]. For the NLMM algorithm using the MH nonlinearity with a practical value of k_x = 2.576, A_y, S_y, and C_y are quite close to one, and its performance is therefore similar to that of the conventional NLMS algorithm.…”
Section: Remarks (mentioning)
Confidence: 79%