The Thirty-Seventh Asilomar Conference on Signals, Systems & Computers, 2003
DOI: 10.1109/acssc.2003.1292193

Bayesian methods for sparse RLS adaptive filters

Abstract: This work deals with an extension of the standard recursive least squares (RLS) algorithm. It makes it possible to prune irrelevant coefficients of a linear adaptive filter with a sparse impulse response, and it provides a regularization method with automatic adjustment of the regularization parameter. New update equations for the inverse auto-correlation matrix estimate are derived that account for the continuing shrinkage of the matrix size. In the case of densely populated impulse responses of length M, the computational co…
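The baseline the abstract refers to is the standard exponentially weighted RLS recursion, whose inverse autocorrelation update the paper modifies to track a shrinking coefficient set. As a point of reference, here is a minimal sketch of that standard recursion (not the paper's pruned variant; the forgetting factor and initialization are illustrative assumptions):

```python
import numpy as np

def rls_step(P, w, x, d, lam=0.99):
    """One update of standard exponentially weighted RLS.

    P   : current inverse autocorrelation matrix estimate (M x M)
    w   : current filter coefficients (M,)
    x   : newest input regressor (M,)
    d   : desired sample
    lam : forgetting factor (illustrative choice)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    e = d - w @ x                      # a priori error
    w = w + k * e                      # coefficient update
    P = (P - np.outer(k, Px)) / lam    # Sherman-Morrison inverse update
    return P, w
```

The paper's contribution sits on top of this recursion: when a coefficient is pruned, the corresponding row and column of P must be removed, which requires the down-dated update equations derived in the paper rather than the fixed-size update above.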

Cited by 5 publications (15 citation statements)
References 4 publications (1 reference statement)
“…(28)–(31) imply an efficient way for updating M_L(n) from the previously computed matrix M_L(n − 1), noting that (27) and the partition…”
Section: The MIPAPA
confidence: 99%
“…This task corresponds to inverse matrix down-dating (i.e., extraction of the inverse that corresponds to a lower-order partition of a matrix), and it can be easily tackled using the matrix inversion lemma (see, e.g., [31]–[33]). Application of the matrix inversion lemma for partitioned matrices ((4.160) in [2]) in (33) results in…”
Section: A Fast Exact Implementation of the MIPAPA
confidence: 99%
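The down-dating step this snippet describes follows from the partitioned-matrix form of the inversion lemma: given the inverse of a full symmetric matrix, the inverse of its leading principal submatrix is obtained by a rank-one correction, with no fresh inversion. A minimal NumPy illustration for removing the last row and column (generic symbols, not the paper's notation):

```python
import numpy as np

def downdate_inverse(Ainv):
    """Given Ainv = inv(A) for symmetric A = [[B, c], [c.T, d]],
    return inv(B) without re-inverting.

    Partition Ainv = [[E, f], [f.T, g]]; the partitioned matrix
    inversion lemma gives inv(B) = E - f f.T / g.
    """
    E = Ainv[:-1, :-1]
    f = Ainv[:-1, -1]
    g = Ainv[-1, -1]
    return E - np.outer(f, f) / g
```

This is an O(M^2) operation, versus O(M^3) for inverting the submatrix from scratch, which is why down-dating makes coefficient pruning cheap inside an RLS-type recursion.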
“…Notice also that the parameters of all variational distributions are expressed in terms of expectations of expressions of the other parameters. This gives rise to a variational iterative scheme, which involves updating (16), (17) for …, and (19), (25) for …, in a sequential manner. Due to the convexity of the factors …, the variational Bayes algorithm converges to a sparse solution in a few cycles [15].…”
Section: A Batch Variational Bayes With a Student-t Prior
confidence: 99%
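The cyclic update scheme this snippet refers to can be illustrated on a generic sparse Bayesian linear model: each weight carries a hierarchical Gaussian–Gamma (Student-t) prior, and the variational factors for the weights and their precisions are updated in turn until the precisions of irrelevant weights grow large and those weights are driven toward zero. A hedged sketch follows; the hyperparameters, the fixed noise precision, and the update order are illustrative assumptions, not the cited work's exact equations (16)–(25):

```python
import numpy as np

def vb_sparse_regression(X, y, a=1e-6, b=1e-6, n_iter=50):
    """Illustrative variational scheme for a linear model with an
    independent Student-t (Gaussian-Gamma) prior on each weight.

    a, b : Gamma hyperprior parameters (assumed, near-uninformative)
    The noise precision is held fixed for simplicity (assumption).
    """
    N, M = X.shape
    beta = 1.0 / np.var(y)            # fixed noise precision (assumption)
    alpha = np.ones(M)                # expected weight precisions
    for _ in range(n_iter):
        # q(w): Gaussian with covariance Sigma and mean mu
        Sigma = np.linalg.inv(beta * X.T @ X + np.diag(alpha))
        mu = beta * Sigma @ X.T @ y
        # q(alpha_i): Gamma; its mean uses E[w_i^2] = mu_i^2 + Sigma_ii
        alpha = (2 * a + 1) / (2 * b + mu**2 + np.diag(Sigma))
    return mu, alpha
```

Weights whose precisions alpha_i diverge are effectively pruned, which is the sparsity mechanism the snippet's convergence remark is about.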
“…Section II defines the mathematical formulation of the adaptive estimation… [footnote:] Note that a Bayesian approach to adaptive filtering has been previously proposed in [16]. However, in [16] a type-II maximum likelihood inference method is adopted that leads to a regularized RLS-type scheme. This is completely different from the approach and algorithms described in this work.…”
Section: Introduction
confidence: 99%
“…Other applications of the RVM algorithm comprise, for example, the determination of relevant weights in adaptive filters, where the complexity of the adaptive filters is also considerably reduced without impairing performance [38].…”
Section: Bayesian Regularization and Pruning
confidence: 99%