Least-Mean-Square Adaptive Filters (2003)
DOI: 10.1002/0471461288.ch3
Energy Conservation and the Learning Ability of LMS Adaptive Filters

Abstract: This chapter provides an overview of interesting phenomena pertaining to the learning capabilities of stochastic-gradient adaptive filters, and in particular those of the least-mean-squares (LMS) algorithm. The phenomena indicate that the learning behavior of adaptive filters is more sophisticated, and also more favorable, than was previously thought, especially for larger step-sizes. The discussion relies on energy conservation arguments and elaborates on both the mean-square convergence and the almost-sure convergence.
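To make the abstract concrete, the stochastic-gradient update the chapter studies can be sketched as below. This is a minimal illustration, not the chapter's own code; the filter length, step-size, noise level, and system-identification setup are assumptions chosen purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown system to identify (values are arbitrary assumptions).
w_true = np.array([0.5, -0.3, 0.2, 0.1])
M = len(w_true)      # filter length
mu = 0.01            # step-size
sigma_v = 0.01       # measurement-noise standard deviation

N = 5000
x = rng.standard_normal(N + M)   # white Gaussian input
w = np.zeros(M)                  # adaptive weight vector

for n in range(N):
    u = x[n:n + M][::-1]         # regressor, most recent sample first
    d = w_true @ u + sigma_v * rng.standard_normal()  # desired signal
    e = d - w @ u                # a priori estimation error
    w = w + mu * e * u           # LMS stochastic-gradient update

print(np.round(w, 2))            # approaches w_true, e.g. ≈ [0.5, -0.3, 0.2, 0.1]
```

With a small step-size and white input, the weights converge in the mean square to the true system, with a residual fluctuation proportional to the step-size; the energy conservation framework the abstract refers to quantifies exactly this trade-off.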

Cited by 5 publications (6 citation statements); references 10 publications (19 reference statements).
“…The evolution of the error s[n] = U H F x[n] in the compact transformed domain is totally equivalent to the behavior of x[n] from a mean-square error point of view. Thus, using energy conservation arguments [57], we consider a general weighted squared error sequence s…”
Section: B. Mean-Square Analysis
Mentioning; confidence: 99%
“…In particular, it is immediate to see that (48) can be obtained from (44)-(45), by substituting the terms p_i in (46) with p_i^2, for the case i = j. Such approximation appears in (48) only in the term O(µ^2) and, under Assumption 3, it is assumed to produce a negligible deviation from (43). Now, we proceed by showing stability conditions for recursion (42).…”
Section: Discussion
Mentioning; confidence: 99%
“…(48) only in the term O(µ^2) and, under Assumption 3, it is assumed to produce a negligible deviation from (43). Now, we proceed by showing the stability conditions for recursion (42).…”
Section: Appendix A, Proof of Theorem
Mentioning; confidence: 99%
“…Subsequently, the mean-square performance of the algorithm is analyzed based on an energy conservation relation [104], [106], [107], [108], [110], [131], [132]. Stability bounds for the step size and an expression for the steady-state MSE are determined by manipulating various error quantities in the energy conservation relation.…”
Section: Chapter 5, Stability and Performance Analysis
Mentioning; confidence: 99%
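The last citation statement mentions deriving a steady-state MSE expression from an energy conservation relation. A quick numerical check of that idea is sketched below: for LMS with white unit-variance Gaussian input and small step-size, the energy-conservation analysis predicts a steady-state MSE of σ_v² + µ σ_v² Tr(R) / (2 − µ Tr(R)). All parameter values here are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed demonstration parameters: filter length, step-size, noise variance.
M, mu, sigma_v2 = 8, 0.005, 1e-2
w_true = rng.standard_normal(M) / np.sqrt(M)

N = 200_000
x = rng.standard_normal(N + M)   # white unit-variance Gaussian input
w = np.zeros(M)
err2 = []

for n in range(N):
    u = x[n:n + M][::-1]
    d = w_true @ u + np.sqrt(sigma_v2) * rng.standard_normal()
    e = d - w @ u
    w = w + mu * e * u           # LMS update
    if n > N // 2:               # collect steady-state samples only
        err2.append(e * e)

mse_sim = np.mean(err2)
# Energy-conservation prediction (small step-size, white input, Tr(R) = M):
trR = float(M)
mse_theory = sigma_v2 + mu * sigma_v2 * trR / (2 - mu * trR)
print(mse_sim, mse_theory)       # the two values agree to within a few percent
```

The simulated steady-state MSE matches the closed-form prediction closely, illustrating how manipulating error quantities in the energy conservation relation yields both step-size stability bounds and steady-state performance, as the citing work describes.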