Abstract: The maximum correntropy criterion (MCC) algorithm has attracted much attention due to its capability of combating impulsive noise. However, its performance depends on the choice of the kernel width, which is a difficult issue. Several variable kernel width schemes based on various error functions have been proposed to address this problem. Nevertheless, these methods may not provide an optimal kernel width because they incorporate no knowledge of the background noise, which actually influences the optimization …
“…The IVKW-MCC replaces σ in (5) with a time-varying kernel width σ(k) [21] and obtains the design of the variable kernel width…”
Section: Review of the IVKW-MCC Algorithm
confidence: 99%
“…If the impulsive noise follows the alpha-stable noise model, δ² should be estimated by adopting a moving-average strategy between the error power and the output-noise power, as in Ref. [21]. IVKW-MCC does an outstanding job in terms of kernel width, and we now discuss the proposed variable step-size algorithm.…”
Section: Review of the IVKW-MCC Algorithm
confidence: 99%
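The moving-average power estimation mentioned above can be sketched as a generic exponential moving average of the instantaneous error power. This is only an illustration of the idea, not the exact estimator from Ref. [21]; the smoothing factor `lam` is a hypothetical choice.

```python
import numpy as np

def moving_average_power(prev_power, e, lam=0.97):
    """Exponential moving average of the instantaneous error power e^2(k).
    lam is a hypothetical smoothing factor (not specified in the quoted text)."""
    return lam * prev_power + (1 - lam) * e**2

# Example: smooth the power of a unit-variance noisy error sequence.
rng = np.random.default_rng(1)
p = 0.0
for e in rng.standard_normal(2000):
    p = moving_average_power(p, e)
print(p)  # hovers near the true error power of 1
```

A recursive average like this reacts to changes in the noise level while suppressing single impulsive samples, which is why such schemes are common for estimating noise power under alpha-stable noise.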
“…The MCC algorithm can also be written as w(k+1) = w(k) + μ′(e(k)) e(k) u(k) (20), where μ′(e(k)) = μ(e(k)) exp(−e²(k)/(2σ²(k))). From the limitation on the step-size factor in the LMS algorithm [20], 0 < μ′(e(k)) < 2/(3 tr(R)) (21), where tr(·) is the trace operator, and R is the autocorrelation matrix of the input vector and is given by…”
Section: Stability Analysis
confidence: 99%
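The quoted update (20) and stability bound (21) can be illustrated with a minimal sketch, assuming a fixed base step size μ and kernel width σ (the variable-parameter rules from the cited papers are not reproduced here; the 4-tap system and all constants are hypothetical):

```python
import numpy as np

def mcc_step(w, u, d, mu, sigma):
    """One MCC update, eq. (20): w(k+1) = w(k) + mu'(e(k)) e(k) u(k),
    with mu'(e) = mu * exp(-e^2 / (2 sigma^2))."""
    e = d - w @ u                                 # a priori error
    mu_eff = mu * np.exp(-e**2 / (2 * sigma**2))  # Gaussian-kernel weighting
    return w + mu_eff * e * u, e

# Hypothetical system identification with white unit-power input, so R = I
# and bound (21) reads 0 < mu'(e(k)) < 2 / (3 * tr(R)) = 1/6 for 4 taps.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2, 0.1])
w = np.zeros(4)
mu, sigma = 0.05, 1.0            # mu <= 1/6, so (21) holds for every k
for _ in range(5000):
    u = rng.standard_normal(4)
    d = w_true @ u + 0.01 * rng.standard_normal()
    w, _ = mcc_step(w, u, d, mu, sigma)
print(w)  # close to w_true
```

Note that because exp(−e²/(2σ²)) ≤ 1, the effective step μ′ never exceeds the base μ, so choosing μ below the LMS bound keeps (21) satisfied at every iteration; large (impulsive) errors shrink μ′ toward zero, which is the source of MCC's robustness.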
“…The unknown system vector w₀ changes to −w₀ at iteration 2 × 10⁴. GSNR = 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29…”
“…Nevertheless, the performance surface of correntropy is markedly nonconvex, which may give rise to slow convergence. Recent work in information theoretic learning [2-7] has shown that the kernel risk-sensitive loss (KRSL) has a more convex performance surface, and the corresponding minimum KRSL (MKRSL) algorithm can achieve faster convergence and higher accuracy without impairing the robustness against outliers [4]. However, the MKRSL algorithm is limited to adaptive filtering in the real domain.…”
Similarity measures play a significant role in adaptive filtering. Previous measures, such as correntropy and the kernel risk-sensitive loss (KRSL), have successfully advanced adaptive filtering in terms of robustness against outliers, fast convergence, and high filtering accuracy. Building on KRSL, a recently proposed similarity measure, the complex KRSL (CKRSL), extends KRSL to the complex domain and achieves superior performance over other similarity measures in adaptive filtering algorithms. However, the minimum CKRSL (MCKRSL) algorithm may perform poorly when its parameters are not properly chosen. In this Letter, an adaptive parameter selection is proposed to help the MCKRSL algorithm improve performance while avoiding the uncertainty of manual selection. The proposed MCKRSL with variable parameters (MCKRSL-VP) algorithm updates the risk-sensitive parameter and kernel width by minimizing the squared bias at each iteration. A moving-average scheme is further used to smooth the updates of the risk-sensitive parameter and kernel width. Finally, the authors verify by simulation that MCKRSL-VP outperforms other algorithms.
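The CKRSL loss and a fixed-parameter MCKRSL gradient step can be sketched as follows. The variable-parameter (λ, σ) update that defines MCKRSL-VP is the Letter's contribution and is not reproduced here; the 2-tap complex system, step size, and parameter values below are hypothetical.

```python
import numpy as np

def ckrsl_loss(e, lam, sigma):
    """Complex kernel risk-sensitive loss of a complex error e:
    (1/lam) * exp(lam * (1 - exp(-|e|^2 / (2 sigma^2))))."""
    return np.exp(lam * (1 - np.exp(-np.abs(e)**2 / (2 * sigma**2)))) / lam

def mckrsl_step(w, u, d, mu, lam, sigma):
    """One stochastic-gradient step on ckrsl_loss for a complex filter
    with output y = w^H u (constant factors absorbed into mu)."""
    e = d - np.vdot(w, u)                      # complex a priori error
    kappa = np.exp(-np.abs(e)**2 / (2 * sigma**2))
    weight = np.exp(lam * (1 - kappa)) * kappa # KRSL error weighting
    return w + mu * weight * u * np.conj(e), e

# Hypothetical complex system identification.
rng = np.random.default_rng(2)
w_true = np.array([0.5 + 0.2j, -0.3 + 0.1j])
w = np.zeros(2, dtype=complex)
mu, lam, sigma = 0.05, 2.0, 1.0
for _ in range(5000):
    u = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)
    d = np.vdot(w_true, u) + 0.01 * (rng.standard_normal()
                                     + 1j * rng.standard_normal())
    w, _ = mckrsl_step(w, u, d, mu, lam, sigma)
print(w)  # close to w_true
```

As with MCC, the weighting factor vanishes for large |e|, so impulsive samples barely move the weights, while the exp(λ(1 − κ)) term boosts the step for moderate errors, which is the mechanism behind KRSL's faster convergence on its more convex performance surface.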