2020 · DOI: 10.1109/tcsii.2018.2880564
An Improved Variable Kernel Width for Maximum Correntropy Criterion Algorithm

Abstract: The maximum correntropy criterion (MCC) algorithm has attracted much attention due to its capability of combating impulsive noise. However, its performance depends on the choice of kernel width, which is a difficult issue. Several variable kernel width schemes based on various error functions have been proposed to address this problem. Nevertheless, these methods may not provide an optimal kernel width because they do not incorporate any knowledge of the background noise, which actually influences the optimization …
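The MCC update the abstract refers to can be sketched as follows. This is a minimal illustration with a fixed kernel width, not the paper's improved variable-width scheme; the function name `mcc_lms` and the parameter defaults are assumptions for the sketch.

```python
import numpy as np

def mcc_lms(x, d, sigma=1.0, mu=0.05, taps=4):
    """Adaptive FIR filter trained under the maximum correntropy
    criterion (MCC).  The Gaussian kernel exp(-e^2 / (2*sigma^2))
    scales each stochastic-gradient step, so impulsive errors
    contribute almost nothing to the weight update."""
    w = np.zeros(taps)
    for k in range(taps - 1, len(x)):
        u = x[k - taps + 1:k + 1][::-1]     # regressor [x_k, ..., x_{k-taps+1}]
        e = d[k] - w @ u                    # instantaneous error
        g = np.exp(-e**2 / (2 * sigma**2))  # correntropy weighting of the step
        w += mu * g * e * u                 # MCC stochastic-gradient update
    return w
```

With a small fixed `sigma`, occasional large errors (impulses) drive `g` toward zero and are effectively ignored, which is the robustness property the abstract mentions; the open question the paper addresses is how to choose `sigma` well.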


Cited by 42 publications (18 citation statements) · References 23 publications
“…The IVKW-MCC replaces σ in (5) with a time-varying kernel width σ(k) [21] and obtains the design of the variable kernel width…”
Section: Review of the IVKW-MCC Algorithm
confidence: 99%
“…If the impulsive noise follows the alpha-stable noise model, δ² should be estimated by adopting a moving-average strategy between the error power and the output-noise power, as in Ref. [21]. IVKW-MCC has done an outstanding job in terms of kernel width, and we now discuss the proposed variable step size algorithm.…”
Section: Review of the IVKW-MCC Algorithm
confidence: 99%
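The moving-average idea mentioned in this citation statement can be illustrated with a small sketch. The exact rule of Ref. [21] is not reproduced here; `update_sigma`, the smoothing factor `beta`, and the `floor` guard are hypothetical names for a rule in the same spirit: smooth the instantaneous error power and subtract the estimated output-noise power to track the excess error energy.

```python
def update_sigma(sigma2_prev, e_k, noise_power, beta=0.95, floor=1e-6):
    """Hypothetical variable-kernel-width recursion (illustrative only):
    exponentially smooth the squared error, then remove the estimated
    output-noise power so the kernel width tracks the excess error
    energy.  A small floor keeps the width strictly positive."""
    e2_smooth = beta * sigma2_prev + (1 - beta) * e_k ** 2
    return max(e2_smooth - noise_power, floor)
```

The subtraction reflects the statement's point: without knowledge of the background-noise power, a purely error-driven width would be biased upward by the noise itself.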
“…Nevertheless, the performance surface of correntropy is markedly nonconvex, which may give rise to inferior convergence speed. Recent work in information-theoretic learning [2][3][4][5][6][7] has shown that the kernel risk-sensitive loss (KRSL) has a more convex performance surface, and the corresponding minimum KRSL (MKRSL) algorithm can produce faster convergence and higher accuracy without impairing robustness against outliers [4]. However, the MKRSL algorithm is limited to adaptive filtering in the real domain.…”
confidence: 99%
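The KRSL mentioned in this citation statement is commonly written as L(e) = (1/λ)·exp(λ(1 − exp(−e²/(2σ²)))). A small sketch under that assumed form, alongside the plain correntropy loss for comparison:

```python
import numpy as np

def krsl(e, sigma=1.0, lam=2.0):
    """Kernel risk-sensitive loss of an error sample: an exponential
    re-weighting of the Gaussian kernel that steepens the surface near
    zero error while still saturating for large outliers."""
    kappa = np.exp(-e**2 / (2 * sigma**2))  # Gaussian kernel of the error
    return np.exp(lam * (1.0 - kappa)) / lam

def correntropy_loss(e, sigma=1.0):
    # MCC cost (up to constants): 1 minus the Gaussian kernel of the error
    return 1.0 - np.exp(-e**2 / (2 * sigma**2))
```

Both losses saturate as |e| grows (KRSL at exp(λ)/λ), which is the shared robustness property; the risk-sensitive exponent λ sharpens the curvature near e = 0, which is what the cited work credits for the faster convergence.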