2020
DOI: 10.1109/access.2019.2962861
The Bias-Compensated Proportionate NLMS Algorithm With Sparse Penalty Constraint

Abstract: To compensate for the bias caused by noisy input, which is usually ignored by ordinary algorithms, two novel algorithms with zero-attraction (ZA) penalties are proposed in this paper. The first constructs a bias-compensated term in the update recursion of the zero-attraction proportionate normalized least mean square (PNLMS) algorithm and is named the BC-ZA-PNLMS algorithm. The second employs the bias-compensated term and the correntropy induced metric (CIM) constraint to renew the update recursion …
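A minimal sketch of the kind of recursion the abstract describes is given below, assuming the usual PNLMS proportionate-gain rule, a sign-function zero attractor, a known input-noise variance, and a BC-NLMS-style bias-compensation term; the variable names and the exact form of each term are illustrative, not the paper's equations.

```python
import numpy as np

def bc_za_pnlms_update(w, x, d, mu=0.5, rho=0.01, delta_p=0.01,
                       eps=1e-6, rho_za=1e-4, sigma_in2=1e-2):
    """One update of a bias-compensated zero-attraction PNLMS filter.

    Hypothetical sketch: the proportionate gains, the zero attractor and the
    bias-compensation term follow the generic PNLMS / BC-NLMS structure; the
    paper's recursion may differ.
    w        : current weight vector (length L)
    x        : current input regressor (length L), observed with input noise
    d        : desired (reference) sample
    sigma_in2: assumed-known variance of the input noise
    """
    # Proportionate gain vector g (diagonal of G(n)), standard PNLMS rule
    l_inf = np.max(np.abs(w))
    gamma = np.maximum(rho * max(delta_p, l_inf), np.abs(w))
    g = gamma / np.mean(gamma)

    e = d - x @ w                        # a-priori error
    norm = x @ (g * x) + eps             # gain-weighted input energy

    w_new = (w
             + mu * e * g * x / norm     # PNLMS gradient step
             + mu * sigma_in2 * w / norm # bias-compensation term for noisy input
             - rho_za * np.sign(w))      # zero-attraction sparse penalty
    return w_new
```

Setting sigma_in2 = 0 and rho_za = 0 reduces the sketch to a plain PNLMS step, which is a quick sanity check on the structure.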

Cited by 8 publications (4 citation statements). References 57 publications (47 reference statements).
“…Yet, estimating the input noise power might be adversely affected by impulse noise present at the output of the unidentified system. The same trick of adopting CIM as the sparse penalty constraint is applied in [20], which is referred to as BC-CIM-PNLMS hereinafter. On the other hand, an L0-norm cost function was used to accelerate convergence for sparse systems.…”
Section: Of 15
Mentioning confidence: 99%
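The CIM sparse penalty referred to in the statement above is commonly defined through a Gaussian kernel; the sketch below shows that common definition and its gradient, which acts as a zero attractor on the filter taps. The kernel width sigma is an illustrative choice, not a value from the paper.

```python
import numpy as np

def cim_penalty_and_gradient(w, sigma=0.05):
    """Correntropy induced metric (CIM) between w and the zero vector,
    with its gradient, for use as a sparse penalty / zero attractor.

    Sketch under the usual Gaussian-kernel definition of the CIM.
    """
    n = w.size
    k0 = 1.0 / (sigma * np.sqrt(2.0 * np.pi))    # Gaussian kernel value at 0
    expo = np.exp(-w**2 / (2.0 * sigma**2))
    cim2 = (k0 / n) * np.sum(1.0 - expo)         # squared CIM to the zero vector
    grad = (k0 / (n * sigma**2)) * w * expo      # attracts only near-zero taps
    return cim2, grad
```

Because the gradient decays exponentially for large |w_i|, the attractor pulls only small coefficients toward zero, which is why the CIM constraint is used in place of a plain L1 zero attractor for sparse system identification.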
“…We plotted the NMSD learning curves and the evolution of mixing parameters in the simulation results by averaging over 100 independent Monte Carlo trials. The comparative works were BC-NLMS [19], BC-NLMF [21], BC-LMMN [22], BC-CIM-LGMN [23], and BC-CIM-PNLMS [20] algorithms.…”
Section: Setup
Mentioning confidence: 99%
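For reference, a minimal sketch of how an NMSD learning curve averaged over independent Monte Carlo trials is typically computed; the array layout and names are illustrative, not tied to the cited experiment.

```python
import numpy as np

def nmsd_db(w_trials, w_true):
    """Normalized mean square deviation (NMSD) learning curve in dB.

    w_trials: array of shape (num_trials, num_iters, L) holding the weight
              trajectory of each independent Monte Carlo run.
    w_true  : true system impulse response, shape (L,).
    """
    dev = np.sum((w_trials - w_true)**2, axis=-1)     # ||w(n) - w_o||^2 per run
    nmsd = np.mean(dev, axis=0) / np.sum(w_true**2)   # average over runs, normalize
    return 10.0 * np.log10(nmsd)                      # learning curve in dB
```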
“…At present, classical algorithms employed against impulsive noise include the deviation compensation algorithm [18], the sign algorithm [19], the logarithmic cost function algorithm [20], a series of algorithms based on the generalized maximum correntropy criterion [21], and the affine projection algorithm [22], among others. In [23], the affine projection generalized maximum correntropy filtering algorithm was proposed. This algorithm combines affine projection with the generalized maximum correntropy criterion for system identification in impulsive noise environments, offering improved filtering accuracy and faster convergence without computing the inverse of the input data matrix.…”
Section: Introduction
Mentioning confidence: 99%
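As context for the correntropy-based robustness mentioned in this statement, here is a minimal sketch of a generalized-Gaussian correntropy cost; alpha and beta are illustrative shape and bandwidth parameters, and this is not the exact cost of the cited algorithm.

```python
import numpy as np
from math import gamma as gamma_fn

def gmcc_cost(e, alpha=2.0, beta=1.0):
    """Generalized correntropy-style cost on an error sample or vector.

    Sketch of the generalized Gaussian kernel commonly used in GMCC-type
    adaptive filters. Large (impulsive) errors saturate the kernel, so each
    contributes only a bounded amount to the cost, which is the source of
    the robustness to impulsive noise mentioned above.
    """
    norm_const = alpha / (2.0 * beta * gamma_fn(1.0 / alpha))
    kernel = norm_const * np.exp(-np.abs(np.asarray(e) / beta) ** alpha)
    # Maximizing correntropy is equivalent to minimizing this bounded cost.
    return np.mean(1.0 - kernel / norm_const)
```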