2020
DOI: 10.1016/j.jfranklin.2020.09.015
Sparsity-aware normalized subband adaptive filters with jointly optimized parameters

Cited by 15 publications (6 citation statements)
References 34 publications
“…Remark 1: (18) clearly reveals that the proposed SA-RNSAF algorithm will outperform the RNSAF algorithm for identifying sparse systems if and only if Δ(k) < 0 holds. It follows that ρ(k) should satisfy the inequality…”
Section: B. Adaptation of the Sparsity Penalty Weight
confidence: 97%
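A generic illustration of how a condition of this kind typically constrains the penalty weight is sketched below; the quadratic model and the coefficients a(k), b(k) are hypothetical placeholders, not the paper's equation (18) or the inequality it derives.

```latex
% Hypothetical quadratic model of the excess term (placeholders a(k) > 0, b(k) > 0):
%   \Delta(k) = a(k)\,\rho^{2}(k) - b(k)\,\rho(k)
% Under this model, the improvement condition and a natural weight choice are
\Delta(k) < 0 \;\Longleftrightarrow\; 0 < \rho(k) < \frac{b(k)}{a(k)},
\qquad
\rho^{\star}(k) = \arg\min_{\rho(k)} \Delta(k) = \frac{b(k)}{2\,a(k)} .
```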
“…Remark 2: The proposed AOP-SA-RNSAF update generalizes different algorithms, depending on the choice of ϕ(e) in (4b) and f(w) in (3). In the literature, several robust criteria against impulsive noises [6], [7], [9], [10], [14] defined by ϕ(e) and sparsity-aware penalties [15], [16], [18], [19], [22] defined by f(w) have been studied, which can be applied in the AOP-SA-RNSAF. Nevertheless, this paper does not consider the effect of different choices of ϕ(e) and/or f(w), which is worth studying in future work.…”
Section: B. Adaptation of the Sparsity Penalty Weight
confidence: 99%
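A minimal sketch of this pluggable structure is given below, assuming a generic robust, sparsity-aware subband-style update in which the robustness function ϕ(e) and the sparsity-penalty gradient are passed in as arguments. The function names, default values, and update form are hypothetical illustrations, not the paper's exact AOP-SA-RNSAF recursion.

```python
import numpy as np

def phi_huber(e, delta=1e-2):
    """Huber-type score function: linear for small errors, clipped for
    impulsive outliers (one possible robust choice of phi(e))."""
    return np.clip(e, -delta, delta)

def df_zero_attraction(w, eps=1e-3):
    """Smoothed subgradient of an l1-style penalty f(w) = sum|w|,
    a common sparsity-aware choice of f(w)."""
    return w / (np.abs(w) + eps)

def sa_rnsaf_style_update(w, U, d, mu=0.5, rho=1e-4,
                          phi=phi_huber, df=df_zero_attraction, reg=1e-8):
    """One generic sparsity-aware, robust, subband-normalized update.

    w : (M,)  current filter estimate
    U : (N,M) rows are the N subband regressor vectors u_i(k)
    d : (N,)  decimated subband desired samples d_i(k)
    """
    e = d - U @ w                        # subband a priori errors
    norms = np.sum(U * U, axis=1) + reg  # per-subband normalization
    grad = (phi(e) / norms) @ U          # robust, normalized data term
    return w + mu * grad - rho * df(w)   # data term plus zero attraction
```

Swapping phi_huber for a sign-type function yields an SSAF-flavored data term, while replacing df_zero_attraction with the gradient of, say, a log-sum penalty changes the zero attractor; this is the sense in which the choices of ϕ(e) and f(w) select the algorithm.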
“…However, despite having lower computational complexity than the AP algorithm, NSAF algorithms achieve better tracking performance owing to their self-whitening property. Moreover, some variants of the NSAF, such as the sign subband adaptive filter (SSAF) and its variants [10]-[12], the M-estimate NSAF [13], [14], the sparsity-aware SSAF and NSAF [15], [16], and the bias-compensated NSAF [17]-[19], have been proposed to improve performance.…”
Section: Introduction
confidence: 99%
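For context on the baseline that these variants extend, a minimal NSAF sketch follows. It assumes a 2-band Haar analysis filter bank, an AR(1) colored input, and a hypothetical sparse system, and it illustrates only the normalized multiband update at the decimated rate, not the jointly optimized algorithm of the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N_bands, n_samples = 32, 2, 20000
w_true = np.zeros(M)
w_true[[3, 10, 25]] = [1.0, -0.5, 0.3]          # hypothetical sparse system

# Colored AR(1) input and noisy desired signal
u = np.zeros(n_samples)
for n in range(1, n_samples):
    u[n] = 0.9 * u[n - 1] + rng.standard_normal()
d = np.convolve(u, w_true)[:n_samples] + 1e-3 * rng.standard_normal(n_samples)

# 2-band Haar analysis filter bank applied to input and desired signals
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
u_sub = np.stack([np.convolve(u, h)[:n_samples] for h in H])
d_sub = np.stack([np.convolve(d, h)[:n_samples] for h in H])

w, mu, reg = np.zeros(M), 0.5, 1e-8
for k in range(M, n_samples // N_bands):        # update at the decimated rate
    n = k * N_bands
    delta = np.zeros(M)
    for i in range(N_bands):
        u_vec = u_sub[i, n - M + 1:n + 1][::-1]  # subband regressor u_i(k)
        e = d_sub[i, n] - u_vec @ w              # decimated subband error
        delta += e * u_vec / (u_vec @ u_vec + reg)
    w = w + mu * delta                           # multiband-normalized update

msd = np.sum((w - w_true) ** 2) / np.sum(w_true ** 2)
print("normalized misalignment (dB):", 10 * np.log10(msd))
```

The per-subband normalization is what gives the NSAF its decorrelating (self-whitening) behavior on colored inputs; the sparsity-aware and robust variants cited above modify the error term or add a penalty on w around this same structure.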