This chapter provides an overview of interesting phenomena pertaining to the learning capabilities of stochastic-gradient adaptive filters, and in particular those of the least-mean-squares (LMS) algorithm. The phenomena indicate that the learning behavior of adaptive filters is more sophisticated, and also more favorable, than was previously thought, especially for larger step-sizes. The discussion relies on energy conservation arguments and elaborates on both the mean-square convergence and the almost-sure convergence behavior of an adaptive filter.
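Since the chapter centers on the LMS algorithm and the role of the step-size, a minimal sketch of the standard LMS recursion may help fix notation; the variable names, signal model, and parameter values below are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Minimal LMS adaptive filter sketch (system-identification setting, assumed here).
# The filter adapts weights w so that w^T u tracks a reference d, via the
# stochastic-gradient update  w <- w + mu * u * e,  with error  e = d - w^T u.
rng = np.random.default_rng(0)

M = 4                              # filter length (assumed)
mu = 0.05                          # step-size (assumed)
w_true = rng.standard_normal(M)    # unknown system to identify
w = np.zeros(M)                    # adaptive weight estimate

for _ in range(5000):
    u = rng.standard_normal(M)                      # regressor (input) vector
    d = w_true @ u + 0.01 * rng.standard_normal()   # noisy reference signal
    e = d - w @ u                                   # a priori estimation error
    w = w + mu * u * e                              # LMS update

print(np.linalg.norm(w - w_true))  # small residual: w has converged near w_true
```

The step-size `mu` trades convergence speed against steady-state error, which is precisely the regime (larger step-sizes) whose learning behavior the chapter examines.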
INTRODUCTION

Adaptive filters are prominent examples of systems that are designed to adjust to variations in their environments in order to meet certain performance criteria. The