2017
DOI: 10.1016/j.dsp.2017.04.001
Recursive inverse algorithm: Mean-square-error analysis

Cited by 24 publications (19 citation statements)
References 18 publications
“…C(n) represents the filter weight vector calculated at time n, v(n) = Wx(n) represents the transformed input signal, and W represents the wavelet transform matrix of size N × N. µ(n) represents the variable step-size [16], which satisfies the convergence criterion [9]; R(n) represents the estimate of the autocorrelation matrix of the tap-input vector, and p(n) represents the estimate of the cross-correlation vector between the desired output signal d(n) and the tap-input vector, both estimated recursively as:…”
Section: DWT Second-Order Recursive Inverse Algorithm (mentioning; confidence: 99%)
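The recursions quoted above (forgetting-factor estimates of R(n) and p(n), followed by a variable-step-size weight update) can be sketched as follows. This is a minimal illustration of the plain RI update, not the cited DWT second-order variant; the step-size rule µ(n) = µ0/tr(R(n)) and all parameter values are assumptions made for the example.

```python
import numpy as np

def ri_filter(x, d, M=4, lam=0.99, mu0=0.05):
    """Minimal recursive inverse (RI) adaptive filter sketch.

    R(n) = lam*R(n-1) + x(n) x(n)^T   (autocorrelation estimate)
    p(n) = lam*p(n-1) + d(n) x(n)     (cross-correlation estimate)
    w(n) = w(n-1) + mu(n) * (p(n) - R(n) w(n-1))

    mu(n) = mu0 / tr(R(n)) is one assumed variable-step-size choice;
    since lambda_max(R) <= tr(R), it keeps mu(n)*lambda_max(R(n)) <= mu0,
    which satisfies the convergence criterion mu(n) < 2/lambda_max(R(n)).
    """
    R = np.zeros((M, M))
    p = np.zeros(M)
    w = np.zeros(M)
    xbuf = np.zeros(M)            # tap-input vector [x(n), ..., x(n-M+1)]
    e = np.zeros(len(x))
    for n in range(len(x)):
        xbuf = np.roll(xbuf, 1)
        xbuf[0] = x[n]
        R = lam * R + np.outer(xbuf, xbuf)
        p = lam * p + d[n] * xbuf
        mu = mu0 / (np.trace(R) + 1e-12)
        e[n] = d[n] - w @ xbuf    # a priori estimation error
        w = w + mu * (p - R @ w)  # RI weight update: no inverse of R needed
    return w, e
```

For the wavelet-domain variant in the excerpt, the tap-input vector x(n) would be replaced by the transformed vector v(n) = Wx(n).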
“…It has been shown that the RI algorithm performs significantly better than the LMS algorithm and its variants. Its performance is also comparable to that of the RLS algorithm in terms of convergence rate and excess mean-square error (MSE) [16], in various settings, at a lower computational cost. Further improvement of the performance of the recursive inverse algorithm was achieved by considering second-order estimation of the correlations in the update equation of the RI algorithm [17].…”
Section: Introduction (mentioning; confidence: 99%)
“…A possible solution to overcome this problem is to use a variable-forgetting-factor (VFF-RLS) algorithm [18,19]. The recursive inverse (RI) algorithm [20,21] has been proposed to overcome the drawbacks and limitations of the above-mentioned adaptive filters. It has been shown that the RI algorithm performs considerably better than the LMS algorithm and its variants.…”
Section: Introduction (mentioning; confidence: 99%)
“…The recursive inverse (RI) algorithm [22,23] and the iterative Wiener filter (IWF) algorithm [24] have recently been proposed. RI and IWF share the same structure, apart from the step-size calculation.…”
Section: Introduction (mentioning; confidence: 99%)
“…They perform similarly to the conventional RLS algorithm in terms of convergence and mean-square error, without using the inverse of the auto-covariance matrix. Therefore, RI [22,23] and IWF [24] can be regarded as algorithms free of the numerical instability of RLS.…”
Section: Introduction (mentioning; confidence: 99%)
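The structural point made in this excerpt can be shown in a few lines: both RI and IWF iterate the same fixed-point update w ← w + µ(p − Rw), which converges to the Wiener solution w* = R⁻¹p without ever forming R⁻¹ (the quantity RLS propagates and the source of its numerical instability). The matrix, vector, and step size below are illustrative values, not from the cited papers; they differ between RI (variable µ(n)) and IWF (how µ is chosen is the only structural difference the excerpt notes).

```python
import numpy as np

def shared_update(w, R, p, mu):
    """One iteration of the update common to RI and IWF:
    w <- w + mu * (p - R @ w). No inverse of R is required."""
    return w + mu * (p - R @ w)

# Illustrative example: iterating with any mu < 2/lambda_max(R)
# drives w to the Wiener solution w* = R^{-1} p.
R = np.array([[2.0, 0.5],
              [0.5, 1.0]])      # assumed autocorrelation matrix
p = np.array([1.0, 0.3])        # assumed cross-correlation vector
w = np.zeros(2)
for _ in range(500):
    w = shared_update(w, R, p, mu=0.5)
# w is now (numerically) the solution of R w = p
```

Since lambda_max(R) ≈ 2.21 here, mu = 0.5 satisfies mu < 2/lambda_max and the iteration contracts toward the fixed point.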