2011 Conference Record of the Forty Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
DOI: 10.1109/acssc.2011.6190060

A modified non-negative LMS algorithm and its stochastic behavior analysis

Abstract: Non-negativity is a constraint that can sometimes be imposed on the parameters to be estimated. The non-negative least-mean-square (NN-LMS) algorithm is an efficient online method that adaptively finds solutions of a Wiener problem subject to such constraints on the filter weights. However, during the convergence of this algorithm, it has been observed that the weights may have unbalanced convergence rates: small weights converge much more slowly than larger ones, which is usually undesired…
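As a rough illustration of the multiplicative NN-LMS update summarized in the abstract, the sketch below assumes the component-wise form w_i(n+1) = w_i(n) + η e(n) x_i(n) w_i(n), in which the factor w_i(n) keeps each weight non-negative when started positive. All names and parameter values here are illustrative, not taken from the paper:

```python
import numpy as np

def nn_lms(x, d, num_taps, eta=0.05):
    """Sketch of the non-negative LMS (NN-LMS) filter.

    Assumed update: w_i <- w_i + eta * e * u_i * w_i.
    The multiplicative w_i factor keeps weights non-negative
    provided they are initialized positive.
    """
    w = np.full(num_taps, 0.5)                 # positive initialization
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]    # current input regressor
        e = d[n] - u @ w                       # a-priori estimation error
        w = w + eta * e * u * w                # multiplicative NN-LMS step
    return w

# Identify a non-negative system from noisy observations.
rng = np.random.default_rng(0)
w_true = np.array([0.8, 0.5, 0.2, 0.1])
x = rng.standard_normal(20000)
d = np.convolve(x, w_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = nn_lms(x, d, num_taps=4)
```

Note how the effective step size for each weight scales with w_i itself, which is exactly the unbalanced-convergence effect the abstract describes: small weights adapt much more slowly than large ones.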

Cited by 5 publications (6 citation statements) | References 9 publications
“…Therefore, the appropriate α can compensate the distorted P(n) and provide a good approximation for Tr(A₁(n)P(n)). By using (19) and (20), (16) can be rewritten as Tr(P(n+1)) ≃ Tr(P(n)) − 2m(n)aTr(P(n))…”
Section: Derivation of VSS
confidence: 99%
“…In [15–17], the problem of online system parameter estimation under a non‐negativity constraint was investigated. A NN least‐mean‐square (NN‐LMS) algorithm [15] was proposed to solve the Wiener problem under the constraint that the resulting weights need to be NN.…”
Section: Introduction
confidence: 99%
“…To make the analysis tractable, we consider the case of a zero-mean Gaussian input signal [25], [30]. Using (22) with the appropriate subscript in (19), the Sign-Sign NNLMS weight error update equation can be written as (36). Note that, unlike the former two variants, the non-stationarity effect appears in the weight error update (36) as a nonlinear function. Each component of (36) is given by (37). To determine the expected value of (37), we note that it involves a nonlinear function whose argument has an unknown distribution [30], [31]; we therefore proceed as for the Exponential NNLMS algorithm and replace the nonlinear function by its zeroth-order approximation. Taking the expected value of (37), using the results (39) and (40), and expressing the result in vector form yields the mean weight error vector behavior model (42), where the diagonal matrix is formed from the vector whose entries are given by (40).…”
Section: Sign-Sign NNLMS Algorithm
confidence: 99%
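The quoted passage above discusses the Sign-Sign NNLMS weight-error recursion, but the rendered equations (36)–(42) did not survive extraction. As a purely hypothetical sketch, the step below assumes the sign-sign variant replaces e(n) and x_i(n) in the NN-LMS update by their signs while keeping the multiplicative w_i(n) factor that preserves non-negativity; this form is an assumption, not taken from the quoted text:

```python
import numpy as np

def sign_sign_nnlms_step(w, u, d, eta=0.01):
    """One hypothetical Sign-Sign NNLMS step.

    Assumed form: w_i <- w_i + eta * sgn(e) * sgn(u_i) * w_i.
    For eta < 1 the multiplicative w_i factor keeps positive
    weights positive, preserving the non-negativity constraint.
    """
    e = d - u @ w                              # a-priori error
    return w + eta * np.sign(e) * np.sign(u) * w, e

# Demo: weights stay non-negative under the assumed update.
rng = np.random.default_rng(1)
w = np.full(3, 0.5)                            # positive initialization
for _ in range(1000):
    u = rng.standard_normal(3)
    d = u @ np.array([0.6, 0.3, 0.1])          # noiseless reference output
    w, e = sign_sign_nnlms_step(w, u, d)
```

The sign nonlinearity in the update is consistent with the quoted remark that the recursion involves a nonlinear function whose expectation must be approximated (here, by a zeroth-order approximation).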
“…For the analyses that follow, we shall define the weight error vector with respect to the unconstrained solution as (22) and the weight error vector with respect to the mean unconstrained solution as…”
Section: Mean Weight Behavior
confidence: 99%