2012
DOI: 10.1109/lsp.2012.2204435

Maximum Correntropy Estimation Is a Smoothed MAP Estimation

Cited by 232 publications (137 citation statements)
References 11 publications
“…Given two random variables X and Y, the correntropy is defined by [19,28] $V(X,Y)=\mathbb{E}[\kappa(X,Y)]$, where $\kappa(\cdot,\cdot)$ is a shift-invariant kernel and $\mathbb{E}[\cdot]$ denotes the expectation. There is a well-known generalization of the Gaussian density function called the generalized Gaussian density (GGD) function, which with zero mean is given by [26,27] $G_{\alpha,\beta}(e)=\frac{\alpha}{2\beta\,\Gamma(1/\alpha)}\exp\!\left(-\left|\frac{e}{\beta}\right|^{\alpha}\right)$. In this work, we use the GGD function as the kernel function of correntropy, and define the generalized correntropy $V_{\alpha,\beta}(X,Y)=\mathbb{E}\!\left[G_{\alpha,\beta}(X-Y)\right]$…”
Section: A Definition (mentioning)
confidence: 99%
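The excerpt above defines correntropy with a GGD kernel. As a rough illustration (not code from the cited work), the sketch below estimates the generalized correntropy of two sample vectors by replacing the expectation with a sample mean; the function names and the parameters alpha and beta are illustrative assumptions.

```python
import numpy as np
from scipy.special import gamma

def ggd_kernel(e, alpha=2.0, beta=1.0):
    """Zero-mean generalized Gaussian density used as a correntropy kernel."""
    norm = alpha / (2.0 * beta * gamma(1.0 / alpha))
    return norm * np.exp(-np.abs(e / beta) ** alpha)

def generalized_correntropy(x, y, alpha=2.0, beta=1.0):
    """Sample estimator: the expectation E[G_{alpha,beta}(X - Y)] is replaced
    by the mean of the kernel evaluated at the sample differences."""
    e = np.asarray(x) - np.asarray(y)
    return np.mean(ggd_kernel(e, alpha, beta))

# Example: alpha = 2 reduces the GGD to a Gaussian kernel, recovering the
# classical correntropy estimator.
x = np.random.randn(1000)
y = x + 0.1 * np.random.randn(1000)
print(generalized_correntropy(x, y, alpha=2.0, beta=1.0))
```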
“…Remark: As one can see, when using the Gaussian kernel function, correntropy contains all even-order moments of the difference between the random variables X and Y [12,14]. By using this index, the MCC captures the higher-order moments between the true value and the predicted value, and it is mostly guided by the local similarity of the data.…”
Section: Maximum Correntropy Criterion (mentioning)
confidence: 99%
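To make the even-moment remark concrete, here is a small numerical check (not from the cited paper): expanding the Gaussian kernel as exp(-e^2/(2*sigma^2)) = sum_n (-1)^n e^(2n) / ((2*sigma^2)^n n!) shows that the sample correntropy is a weighted sum of the even-order sample moments of the error. The script compares the direct kernel average with a truncated series; the error distribution and bandwidth are arbitrary assumptions.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
e = 0.3 * rng.standard_normal(5000)   # hypothetical prediction errors
sigma = 1.0

# Direct sample correntropy with a Gaussian kernel.
direct = np.mean(np.exp(-e**2 / (2 * sigma**2)))

# Same quantity from the series of even-order sample moments, truncated at 20 terms.
series = sum(
    (-1) ** n * np.mean(e ** (2 * n)) / ((2 * sigma**2) ** n * factorial(n))
    for n in range(20)
)

print(direct, series)   # the two values agree closely
```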
“…is optimal only if the probability distribution function of the prediction errors is Gaussian [11,12]. In order to deal with the non-Gaussian and nonlinear problems in engineering applications, a novel criterion, namely the maximum correntropy criterion (MCC), was introduced in [12,13], and its properties are discussed in [14,15]. Because the prediction errors for small-sample electricity data have non-Gaussian statistical characteristics, an LSSVM prediction mechanism using the MCC is developed in this paper.…”
Section: Introduction (mentioning)
confidence: 99%
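As a hedged illustration of why the MCC helps with non-Gaussian errors (a generic location-estimation toy, not the LSSVM mechanism of the citing paper), the sketch below compares the MSE solution, i.e. the sample mean, with an MCC fixed-point estimate on data contaminated by impulsive outliers; the kernel bandwidth sigma is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
true_value = 5.0
data = true_value + 0.1 * rng.standard_normal(200)
data[:20] += 50.0                      # inject heavy-tailed outliers

sigma = 1.0                            # kernel bandwidth (assumed, not tuned)
theta = np.median(data)                # initial guess
for _ in range(50):                    # MCC fixed-point iterations
    w = np.exp(-(data - theta) ** 2 / (2 * sigma**2))   # Gaussian kernel weights
    theta = np.sum(w * data) / np.sum(w)                 # weighted mean update

print("MSE estimate (mean):", data.mean())   # pulled away by the outliers
print("MCC estimate       :", theta)         # stays near the true value
```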
“…However, these algorithms still cannot match the fast initial convergence achieved by the PNLMS in applications with highly sparse impulse responses. Recently, an information-theoretic quantity has been presented to build the required cost function for adaptive signal processing applications [32][33][34][35][36][37][38]. To calculate these quantities, an entropy estimator was reported in detail.…”
Section: Introduction (mentioning)
confidence: 99%
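A minimal sketch of how such a correntropy-based cost can drive an adaptive filter (an MCC-flavored LMS update, not the specific PNLMS variants discussed above): the Gaussian-kernel factor scales each weight update so that impulsive errors contribute little. The step size, bandwidth, and the sparse test system below are illustrative assumptions.

```python
import numpy as np

def mcc_lms(x, d, num_taps=8, mu=0.05, sigma=1.0):
    """LMS-style adaptive filter whose update is weighted by a Gaussian kernel
    of the a priori error (maximum correntropy criterion)."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]              # current input regressor
        e = d[n] - w @ u                                 # a priori error
        w += mu * np.exp(-e**2 / (2 * sigma**2)) * e * u # kernel-weighted update
    return w

rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
h = np.array([0.8, 0.0, 0.0, 0.5, 0.0, 0.0, 0.0, -0.3])   # sparse "true" system
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(mcc_lms(x, d))   # estimated taps approach h
```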