2004
DOI: 10.1109/tsp.2003.820077

Mean-Square Performance of a Family of Affine Projection Algorithms

Cited by 292 publications (237 citation statements)
References 13 publications

“…The expectation of ‖w̃(k)‖²_{Σ′} is difficult to calculate because of the dependency of Σ′ on C(k), Z(k), and X(k), and of w̃(k) on prior regressors. To solve this problem, we need to use the following independence assumptions [14]:…”
Section: Appendix B Mean-Square Stability Analysis of the Family of S…
mentioning
confidence: 99%
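The quoted passage stops just before the assumptions themselves. As a hedged sketch, the separation-style conditions that mean-square analyses of APA-type filters typically invoke look roughly as follows; the exact statement in [14] may differ, and the noise symbol v(k) is an assumption, not taken from the quote:

```latex
% Hedged sketch of the independence (separation) assumptions commonly used
% in mean-square analyses of APA-type adaptive filters. The precise
% conditions in [14] may be stated differently; v(k) denotes measurement
% noise and is an assumed symbol.
\begin{itemize}
  \item[(A1)] The weight-error vector $\tilde{w}(k)$ is statistically
              independent of the current data $\{X(k), D(k)\}$.
  \item[(A2)] The noise $v(k)$ is zero mean and i.i.d., with
              $\mathbb{E}\{v(k)\} = 0$,
              $\mathbb{E}\{v(k)v(l)\} = \sigma_v^2\,\delta_{kl}$,
              and $v(k)$ independent of $X(l)$ for all $k, l$.
\end{itemize}
```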
“…Equation (64) is stable if the matrix G is stable [14]. From (66), the convergence to the mean of the adaptive algorithm in (44) is guaranteed for any µ that satisfies:…”
Section: w̃(k) Is Independent of Dᵀ(k)Xᵀ(k)
mentioning
confidence: 99%
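The quoted statement ends before the actual step-size condition. As a minimal numerical sketch, assuming the mean recursion has the generic form E{w̃(k+1)} = (I − µA)E{w̃(k)} with a symmetric positive-definite moment matrix A (a stand-in; the exact matrix and bound in the cited work may differ), convergence in the mean reduces to ρ(I − µA) < 1, i.e. 0 < µ < 2/λ_max(A):

```python
import numpy as np

# Hedged sketch: mean-convergence check for a recursion of the assumed form
#   E{w~(k+1)} = (I - mu*A) E{w~(k)},
# with A a symmetric positive-definite moment matrix of the data.
# A here is a random stand-in; the exact matrix in the quoted analysis differs.

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 8))
A = X.T @ X / X.shape[0]                # stand-in regressor moment matrix

lam = np.linalg.eigvalsh(A)             # real eigenvalues (A is symmetric)
mu_max = 2.0 / lam.max()                # |1 - mu*lambda| < 1 for all lambda  <=>  0 < mu < 2/lambda_max

def mean_stable(mu: float) -> bool:
    """True if the mean recursion matrix I - mu*A has spectral radius < 1."""
    return np.max(np.abs(1.0 - mu * lam)) < 1.0

print(f"step-size bound: 0 < mu < {mu_max:.4f}")
print(mean_stable(0.5 * mu_max))        # inside the bound  -> True
print(mean_stable(1.5 * mu_max))        # outside the bound -> False
```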
“…Equation is stable if the matrix G is stable [4]. From (61), we know that G = I − µM + µ²N, where M = E{X(n)D(n)} ⊗ I + I ⊗ E{X(n)D(n)}…”
Section: Appendix B Mean and Mean-Square Stability Analysis of the…
mentioning
confidence: 99%
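As a companion to the reconstructed expression above, the following sketch builds a matrix of the assumed form G = I − µM + µ²N, with M given by the Kronecker-sum structure in the quote; E{X(n)D(n)} and N are random placeholders (the true second-order term N is not given in the quote), so this only illustrates how a spectral-radius stability test would be run:

```python
import numpy as np

# Hedged sketch: stability of G = I - mu*M + mu^2*N via its spectral radius,
# with M = E{X D} (x) I + I (x) E{X D} as suggested by the quote.
# EXD and N below are illustrative placeholders, not the quoted paper's terms.

rng = np.random.default_rng(1)
L = 4                                    # filter length (assumed)

B = rng.standard_normal((L, 400))
EXD = B @ B.T / B.shape[1]               # placeholder for E{X(n)D(n)}

I = np.eye(L)
M = np.kron(EXD, I) + np.kron(I, EXD)    # Kronecker-sum structure from the quote
N = np.kron(EXD, EXD)                    # placeholder second-order term

def spectral_radius(mu: float) -> float:
    """Spectral radius of G(mu) = I - mu*M + mu^2*N."""
    G = np.eye(L * L) - mu * M + mu**2 * N
    return np.max(np.abs(np.linalg.eigvals(G)))

for mu in (0.01, 0.1, 1.0):
    rho = spectral_radius(mu)
    print(f"mu = {mu:<5} rho(G) = {rho:.4f}  mean-square stable: {rho < 1.0}")
```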
“…In the presence of colored input signals, the LMS and NLMS algorithms have extremely slow convergence rates. To solve this problem, a number of adaptive filtering structures based on affine subspace projections [3], [4], [5], data-reusing adaptive algorithms [6], [7], [8], block adaptive filters [2], and multirate techniques [9], [10], [11] have been proposed in the literature. In all of these algorithms, the chosen fixed step size affects both the convergence speed and the steady-state mean-square error.…”
Section: Introduction
mentioning
confidence: 99%
“…APA is a useful family of adaptive filters whose main purpose is to accelerate the convergence of LMS-type filters, especially for correlated data, at a computational cost comparable to that of LMS. While LMS updates the weights based only on the current input vector, APA updates the weights based on the previous input vectors [5]. The algorithm applies update directions that are orthogonal to the last p input vectors and thus decorrelates the input process, speeding up convergence [2].…”
Section: Introduction
mentioning
confidence: 99%
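Since the quoted introduction describes the APA update only qualitatively, here is a brief sketch of the standard (basic) affine-projection update on a toy system-identification problem; the paper analyses a whole family of such algorithms, and the filter length, projection order, step size, and AR(1) input below are assumptions chosen for illustration:

```python
import numpy as np

def apa_update(w, X, d, mu=0.5, eps=1e-4):
    """One basic affine-projection (APA) update.

    w : (L,)   current filter weights
    X : (L, p) last p input (regressor) vectors, newest first
    d : (p,)   corresponding desired samples
    """
    e = d - X.T @ w                      # a-priori errors over the last p regressors
    # Regularized projection step: the update lies in the span of the last p
    # regressors and, for mu = 1 and eps = 0, drives the p a-posteriori errors
    # to zero at once; this reuse of past regressors is what helps with
    # colored (correlated) inputs.
    return w + mu * X @ np.linalg.solve(X.T @ X + eps * np.eye(X.shape[1]), e)

# Toy usage: identify w_true from a strongly correlated AR(1) input, where
# plain (N)LMS would converge slowly.
rng = np.random.default_rng(2)
L, p, n_iter = 8, 4, 2000
w_true = rng.standard_normal(L)
w = np.zeros(L)

x = np.zeros(n_iter + L + p)
for n in range(1, x.size):
    x[n] = 0.95 * x[n - 1] + rng.standard_normal()

for n in range(L + p, n_iter):
    X = np.column_stack([x[n - i - np.arange(L)] for i in range(p)])  # L x p regressors
    d = X.T @ w_true + 0.01 * rng.standard_normal(p)
    w = apa_update(w, X, d)

print(np.round(w - w_true, 3))           # small residual misalignment
```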