2016
DOI: 10.1002/asjc.1448
Recursive Generalized Maximum Correntropy Criterion Algorithm with Sparse Penalty Constraints for System Identification

Abstract: To address the sparse system identification problem in a non‐Gaussian impulsive noise environment, the recursive generalized maximum correntropy criterion (RGMCC) algorithm with sparse penalty constraints is proposed to combat impulsive‐inducing instability. Specifically, a recursive algorithm based on the generalized correntropy with a forgetting factor of error is developed to improve the performance of the sparsity aware maximum correntropy criterion algorithms by achieving a robust steady‐state error. Cons…
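The abstract describes a recursive adaptive filter built on the generalized correntropy of the estimation error, with a forgetting factor and a sparsity-promoting penalty. As a rough illustration of the idea only (not the authors' RGMCC recursion, whose exact form is not reproduced on this page), the sketch below runs a stochastic-gradient adaptive filter that maximizes a generalized-Gaussian correntropy of the error and applies a zero-attracting l1 penalty; the function name and the parameters alpha, beta, mu and rho are illustrative assumptions.

```python
# Minimal sketch, assuming a generalized-Gaussian kernel exp(-|e/beta|**alpha)
# and a zero-attracting l1 penalty; this is NOT the paper's exact recursion.
import numpy as np

def gmcc_sparse_filter(X, d, alpha=2.0, beta=1.0, mu=0.01, rho=1e-4):
    """X: (N, M) regressor matrix, d: (N,) desired responses."""
    N, M = X.shape
    w = np.zeros(M)
    for n in range(N):
        x = X[n]
        e = d[n] - x @ w                      # a priori estimation error
        # gradient of exp(-|e/beta|**alpha) w.r.t. w, up to a positive constant
        kernel = np.exp(-np.abs(e / beta) ** alpha)
        g = kernel * np.abs(e) ** (alpha - 1) * np.sign(e)
        w += mu * g * x                       # correntropy-ascent step
        w -= rho * np.sign(w)                 # zero-attracting sparse penalty
    return w
```

The kernel weight exp(-|e/beta|**alpha) shrinks the update when the error is large, which is where the robustness to impulsive noise comes from; the sign penalty attracts small coefficients toward zero to exploit sparsity.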

Cited by 19 publications (11 citation statements). References 33 publications.
“…Correntropy, as a measure of local similarity that is insensitive to outliers and noise, has been successfully applied in fields such as pattern recognition [38], machine learning [39] and signal processing [40]. Correntropy [22,[41][42][43] can be expressed as follows…”
Section: Correntropy
confidence: 99%
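The expression referred to by "as follows" in the citation statement above is not reproduced on this page. For context, the standard definition of correntropy between random variables X and Y with a Gaussian kernel of bandwidth σ, together with its sample estimator, is:

```latex
V_\sigma(X,Y) \;=\; \mathbb{E}\!\left[\kappa_\sigma(X-Y)\right],
\qquad
\kappa_\sigma(e) \;=\; \exp\!\left(-\frac{e^{2}}{2\sigma^{2}}\right),
\qquad
\hat V_\sigma \;=\; \frac{1}{N}\sum_{i=1}^{N}\kappa_\sigma\!\left(x_i - y_i\right).
```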
“…To estimate the channel in non-Gaussian noise, we define an MCC cost function with exponential forgetting factor λ shown in (1) [20,21] and minimize it adaptively.…”
Section: L1-IWF Formulation
confidence: 99%
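Equation (1) of the citing paper is not shown here. A commonly used MCC cost with exponential forgetting factor λ and Gaussian kernel bandwidth σ has the form

```latex
J_n(\mathbf{w}) \;=\; \sum_{i=1}^{n} \lambda^{\,n-i}\,
\exp\!\left(-\frac{e^{2}(i)}{2\sigma^{2}}\right),
\qquad
e(i) \;=\; d(i) - \mathbf{w}^{\mathsf{T}}\mathbf{x}(i),
```

which is maximized over the weight vector (equivalently, its negative is minimized); the exact expression labelled (1) in the citing paper may differ in sign or normalization.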
“…Recently, the maximum correntropy criterion (MCC) [16][17][18][19] has been successfully applied to various adaptive algorithms robust to impulsive noise. Current studies in robust sparse adaptive methods have resulted in the development of CR-RLS-based algorithms with MCC [20,21], and showed strong robustness under impulsive noise. However, CR-RLS used in [20,21] is not practical when determining the regularization coefficient for the sparse regularization term because CR-RLS [6] needs information about the true channel when calculating the regularization coefficients.…”
Section: Introduction
confidence: 99%
“…It applies a kernel trick that nonlinearly maps the original space to a higher dimensional feature space. It can be shown that correntropy is directly related to the probability of how similar two random variables are in a neighborhood of the joint space controlled by the kernel bandwidth [17,25,26].…”
Section: Similarity Measures in Kernel Space
confidence: 99%
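As a small numerical illustration of how the kernel bandwidth controls that neighborhood (assuming a Gaussian kernel; the variable names are illustrative), the snippet below estimates correntropy between a signal and a copy corrupted by one large outlier, for a narrow and a wide bandwidth:

```python
# Empirical correntropy is a sample mean of Gaussian-kernel evaluations of
# the error.  With a narrow kernel, a single large outlier contributes
# almost nothing; with a very wide kernel the measure behaves more like a
# global, MSE-like similarity and the outlier matters again.
import numpy as np

def correntropy(x, y, sigma):
    e = x - y
    return np.mean(np.exp(-e**2 / (2.0 * sigma**2)))

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
y = x + 0.1 * rng.standard_normal(1000)   # small measurement noise
y[0] += 50.0                              # one impulsive outlier

print(correntropy(x, y, sigma=0.5))       # narrow kernel: outlier has little effect
print(correntropy(x, y, sigma=50.0))      # wide kernel: closer to a global measure
```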