1997
DOI: 10.1109/82.644046
A new quasi-Newton adaptive filtering algorithm

Cited by 44 publications (52 citation statements). References 16 publications.
“…The QN algorithm proposed in [4] is a robust algorithm that was developed using a stochastic approach and is based on the rank-one quasi-Newton update of the Hessian matrix, i.e., satisfying the QN hereditary condition [7]. However, this algorithm can also be shown to minimize the following deterministic cost function:…”
Section: QN Algorithm and the Minimum-Disturbance Approach
Confidence: 99%
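The rank-one quasi-Newton update mentioned in this excerpt is, in generic form, the symmetric rank-one (SR1) correction, which by construction satisfies the secant (hereditary) condition H_{k+1} y_k = s_k. A minimal sketch of that generic update follows; the function name and the skip tolerance are illustrative, and the exact step and gain definitions of the adaptive-filtering QN algorithm in [4] are not reproduced here:

```python
import numpy as np

def sr1_update(H, s, y, eps=1e-12):
    """Symmetric rank-one (SR1) update of an inverse-Hessian estimate H.

    s: parameter step, y: gradient change. The returned matrix satisfies
    the secant (hereditary) condition H_new @ y == s.
    """
    r = s - H @ y                           # residual of the secant condition
    denom = r @ y
    if abs(denom) < eps * np.linalg.norm(r) * np.linalg.norm(y):
        return H                            # skip when the denominator is tiny
    return H + np.outer(r, r) / denom       # rank-one correction

# Verify the hereditary condition on a random example.
rng = np.random.default_rng(0)
H = np.eye(3)
s = rng.standard_normal(3)
y = rng.standard_normal(3)
H_new = sr1_update(H, s, y)
assert np.allclose(H_new @ y, s)
```

The check at the end is the hereditary condition itself: substituting the update gives H_new @ y = H @ y + r = s exactly whenever the denominator is nonzero.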
“…Robust RLS algorithm implementations with reduced computational complexity are usually based on QR decompositions, which are complex to implement and maintain [1]. There are other algorithms that have been developed based on known convex optimization methods, like the quasi-Newton (QN) [4] and interior point least squares (IPLS) algorithms [6]. These algorithms offer increased robustness at a cost of extra computational complexity, for they do not admit O(N) implementations.…”
Section: Introduction
Confidence: 99%
“…The model was used for analyzing the NLMS algorithm [12], and was shown to yield accurate results. The model was also successfully used to analyze the quasi-Newton (QN) [14] and the binormalized data-reusing LMS (BNDRLMS) [15] algorithms.…”
Section: Convergence Analysis
Confidence: 99%
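For context on the algorithms this excerpt analyzes, the NLMS recursion is the standard normalized LMS update w ← w + μ e u / (ε + uᵀu). A minimal system-identification sketch, with illustrative filter length, step size, and signal lengths (not parameters taken from the cited analyses):

```python
import numpy as np

def nlms(x, d, N=4, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter.

    x: input signal, d: desired signal, N: filter length,
    mu: step size (0 < mu < 2), eps: regularization constant.
    Returns the final weight vector and the a-priori error signal.
    """
    w = np.zeros(N)
    e = np.zeros(len(x))
    for k in range(N - 1, len(x)):
        u = x[k - N + 1:k + 1][::-1]          # current input vector [x[k], ..., x[k-N+1]]
        e[k] = d[k] - w @ u                   # a-priori estimation error
        w += mu * e[k] * u / (eps + u @ u)    # normalized gradient step
    return w, e

# Identify an FIR system h from its noiseless output (the filter
# weights should converge to h under white-noise excitation).
rng = np.random.default_rng(1)
h = np.array([0.7, -0.3, 0.2, 0.1])
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)]
w, e = nlms(x, d)
```

The normalization by uᵀu is what makes the convergence behavior (and hence the MSE analysis mentioned above) depend on the direction rather than the power of the input vector.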
“…For the MSE analysis, we assume that the vectors are excited in a discrete number of directions [12,14,15].…”
Section: Excess MSE for White Input Signals
Confidence: 99%