2021
DOI: 10.1109/tsmc.2019.2957269

Minimum Error Entropy Kalman Filter

Abstract: To date most linear and nonlinear Kalman filters (KFs) have been developed under the Gaussian assumption and the well-known minimum mean square error (MMSE) criterion. In order to improve the robustness with respect to impulsive (or heavy-tailed) non-Gaussian noises, the maximum correntropy criterion (MCC) has recently been used to replace the MMSE criterion in developing several robust Kalman-type filters. To deal with more complicated non-Gaussian noises such as noises from multimodal distributions, in the pr…
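The abstract contrasts the MMSE criterion with correntropy- and entropy-based criteria for robustness to impulsive noise. As a rough illustration of how a correntropy weight can robustify a Kalman-type measurement update, the sketch below down-weights large innovations through a Gaussian kernel. This is an assumed, simplified MCC-style update for intuition only, not the paper's MEE-KF algorithm; the function name, the bandwidth sigma, and the single-step (non-iterative) weighting are all assumptions.

```python
import numpy as np

def mcc_weighted_kf_update(x_pred, P_pred, y, H, R, sigma=2.0):
    """One correntropy-weighted measurement update (illustrative sketch only).

    The squared Mahalanobis distance of the innovation is passed through a
    Gaussian kernel; a small weight (impulsive innovation) inflates the
    effective measurement noise and shrinks the Kalman gain. This is a
    generic MCC-style robustification, not the MEE-KF from the paper.
    """
    innov = y - H @ x_pred                            # innovation
    S = H @ P_pred @ H.T + R                          # innovation covariance
    d2 = float(innov.T @ np.linalg.solve(S, innov))   # squared Mahalanobis distance
    w = np.exp(-d2 / (2.0 * sigma ** 2))              # Gaussian-kernel weight in (0, 1]
    R_eff = R / max(w, 1e-8)                          # outlier -> large effective noise
    S_eff = H @ P_pred @ H.T + R_eff
    K = P_pred @ H.T @ np.linalg.inv(S_eff)           # robustified Kalman gain
    x_upd = x_pred + K @ innov
    P_upd = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_upd, P_upd
```

With w = 1 (small innovation) this reduces to the standard Gaussian/MMSE update; as the innovation grows, the update increasingly trusts the prediction instead of the measurement.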

Cited by 128 publications (41 citation statements)
References: 46 publications
“…We should emphasize that the presented results do not imply that EnRDA always performs better than PF and EnKF. The EnKF at the limiting case M → ∞, in the absence of bias, is a minimum mean squared error estimator and attains the lowest possible posterior variance for linear systems, also referred to as the Cramér-Rao lower bound (Cramér, 1999; Rao et al., 1973). Thus, when the errors are drawn from zero mean Gaussian distributions with a linear observation operator, EnKF can outperform EnRDA in terms of the mean squared error.…”
Section: Experimental Setup and Results
Mentioning confidence: 99%
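For readers checking the quoted claim, the standard linear-Gaussian identities make it explicit. The notation below (prior covariance P^-, observation matrix H, measurement noise covariance R) is assumed here for illustration, not taken from the cited papers:

```latex
% Measurement update in information form for y = Hx + v, v ~ N(0, R),
% with prior x ~ N(\hat{x}^-, P^-):
P^{+} = \bigl((P^{-})^{-1} + H^{\top} R^{-1} H\bigr)^{-1}
% The Bayesian information matrix of the same model is
J = (P^{-})^{-1} + H^{\top} R^{-1} H ,
% hence P^{+} = J^{-1}: the MMSE (Kalman) posterior covariance equals the
% posterior Cramér-Rao lower bound, which is what the excerpt asserts for
% the EnKF in the limit of infinitely many ensemble members.
```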
“…The kernel function in entropy is usually a Gaussian kernel due to its smoothness and strict positive definiteness. These properties underpin the effectiveness of the maximum correntropy criterion (MCC) for occlusion and corruption problems [28]-[32], [46], [47]. In particular, MCC is suited for dealing with impulsive noises.…”
Section: Introduction
Mentioning confidence: 94%
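A quick numerical sketch of why MCC copes with impulsive noise better than MSE: the Gaussian kernel saturates for large errors, so a single spike barely changes the correntropy score while it dominates the MSE. The sample errors and bandwidth below are made up purely for illustration.

```python
import numpy as np

def correntropy(e, sigma=1.0):
    """Empirical correntropy of an error sample: mean Gaussian kernel of the errors."""
    return np.mean(np.exp(-e**2 / (2.0 * sigma**2)))

# Hypothetical errors: mostly small, plus one impulsive outlier.
e_clean = np.array([0.1, -0.2, 0.05, 0.15, -0.1])
e_noisy = np.append(e_clean, 50.0)   # add one heavy-tailed spike

# MSE is dominated by the single outlier; correntropy changes only slightly,
# because the Gaussian kernel is bounded and saturates for large errors.
print(np.mean(e_clean**2), np.mean(e_noisy**2))      # MSE jumps by orders of magnitude
print(correntropy(e_clean), correntropy(e_noisy))    # correntropy stays nearly flat
```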
“…Other strategies can be classified as entropy-based, distribution-based, or sliding mode type. Entropy-based strategies utilize a different criterion than the KF, which minimizes the well-known mean square error (MSE) [14,15]. Both methods attempt to improve robustness to heavy-tailed non-Gaussian noises in order to provide a stable estimate.…”
Section: Introduction
Mentioning confidence: 99%
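The "different criterion" referred to in this excerpt is, for the MEE family, Renyi's quadratic error entropy estimated from samples via the information potential. A minimal sketch of that estimator follows; the error values and kernel bandwidth are chosen arbitrarily for illustration and are not from the paper.

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential V(e).

    Minimizing Renyi's quadratic error entropy H2(e) = -log V(e) is equivalent
    to maximizing V(e); this is the generic MEE criterion, shown only for
    intuition. The bandwidth sigma is a free parameter (an assumption here).
    """
    diff = e[:, None] - e[None, :]                    # all pairwise error differences
    return np.mean(np.exp(-diff**2 / (2.0 * sigma**2)))

e = np.array([0.3, -0.1, 0.2, 5.0, -0.2])             # hypothetical filter errors
V = information_potential(e)
H2 = -np.log(V)                                        # quadratic Renyi entropy estimate
print(V, H2)
```

Because the pairwise kernel terms involving the outlier are tiny, the criterion effectively discounts it, whereas a squared-error cost would be dominated by it.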