2016
DOI: 10.1016/j.dsp.2016.07.009
Diffusion maximum correntropy criterion algorithms for robust distributed estimation

Abstract: Robust diffusion adaptive estimation algorithms based on the maximum correntropy criterion (MCC), including adaptation-to-combination MCC and combination-to-adaptation MCC, are developed to deal with distributed estimation over networks in impulsive (long-tailed) noise environments. The cost functions used in distributed estimation are in general based on the mean square error (MSE) criterion, which is desirable when the measurement noise is Gaussian. In non-Gaussian situations, such as the impulsi…
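The adapt-then-combine structure described in the abstract can be illustrated with a short numerical sketch. The snippet below is a minimal illustration, not the authors' exact recursion: each node runs an LMS-style adaptation step whose error is weighted by a Gaussian kernel (the MCC weight) and then averages its neighbors' intermediate estimates. The step size, kernel width, ring topology, and noise model are all assumed for illustration.

```python
# Minimal sketch (not the authors' exact recursion) of an adapt-then-combine (ATC)
# diffusion update in which each node weights its instantaneous error by a
# Gaussian kernel, as the maximum correntropy criterion (MCC) suggests.
# Node count, step size, kernel width, and the ring topology are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, M, iters = 10, 4, 2000          # nodes, filter length, iterations
w_true = rng.standard_normal(M)    # common parameter vector to estimate
mu, sigma = 0.05, 1.0              # step size and MCC kernel width (assumed)

# Uniform combination weights over a ring neighborhood {k-1, k, k+1}.
A = np.zeros((N, N))
for k in range(N):
    for l in (k - 1, k, k + 1):
        A[l % N, k] = 1.0 / 3.0

w = np.zeros((N, M))               # per-node estimates
for i in range(iters):
    psi = np.empty_like(w)
    for k in range(N):
        u = rng.standard_normal(M)                 # regressor at node k
        noise = rng.standard_normal() * 0.1
        if rng.random() < 0.05:                    # occasional impulsive outlier
            noise += rng.standard_normal() * 20.0
        d = u @ w_true + noise
        e = d - u @ w[k]
        g = np.exp(-e**2 / (2.0 * sigma**2))       # MCC weight: ~1 for small e, ~0 for outliers
        psi[k] = w[k] + mu * g * e * u             # adaptation step
    w = A.T @ psi                                  # combination step

print("mean squared deviation:", np.mean((w - w_true) ** 2))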

Cited by 113 publications (51 citation statements)
References 42 publications
“…Different from the conventional MSE criterion, MCC exhibits inherent robustness to outliers in adaptive filtering: when the magnitude of the estimation error is small, MCC, like MSE, measures the L2-norm distance between the system outputs and the observations; as the error magnitude grows, MCC measures the L1-norm distance and eventually the L0-norm distance. MCC-based cost functions have been widely employed in PCA methods, state-space filters, diffusion adaptive estimation, DOA estimation, etc. [14][15][16][17]. In this paper, we apply MCC as a substitute for the conventional MSE criterion in the PAST algorithm.…”
Section: The MCC-PAST Algorithm
confidence: 99%
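The L2/L1/L0 behaviour quoted above can be made concrete with the correntropy-induced loss for a Gaussian kernel. The block below is a brief sketch of the standard argument; the kernel width σ is a free parameter, not a value taken from the cited works.

```latex
% Sketch of why the Gaussian-kernel correntropy loss interpolates between
% L2-, L1-, and L0-like behaviour (kernel width \sigma is an assumed parameter).
\[
  J_\sigma(e) \;=\; 1 - \exp\!\Big(-\frac{e^2}{2\sigma^2}\Big),
\]
\[
  J_\sigma(e) \approx \frac{e^2}{2\sigma^2} \quad (|e| \ll \sigma),
  \qquad
  J_\sigma(e) \to 1 \quad (|e| \gg \sigma).
\]
% Small errors are penalised quadratically (MSE-like), while very large errors
% contribute only a bounded penalty, so an outlier is effectively just "counted"
% (L0-like); the intermediate regime behaves roughly like an L1 penalty.
```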
“…The correntropy criterion is a generalized correlation function of two random variables that measures the similarity between them. [19][20][21][22][23][24][25][26][27][28][29][30][31] For two random variables d_k(i) and y_k(i), the correntropy correlation function can be defined as follows:…”
Section: Single-Task Global Optimization
confidence: 99%
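The quoted definition is cut off. For reference, a commonly used Gaussian-kernel form of correntropy and its sample estimator are given below; this is stated as background, not as the citing paper's exact equation, and the kernel width σ is assumed.

```latex
% Standard Gaussian-kernel correntropy between two random variables
% (background form; not necessarily the citing paper's exact equation).
\[
  V_\sigma\big(d_k, y_k\big) \;=\;
  \mathbb{E}\!\left[\exp\!\Big(-\frac{\big(d_k(i)-y_k(i)\big)^2}{2\sigma^2}\Big)\right]
  \;\approx\;
  \frac{1}{L}\sum_{i=1}^{L}\exp\!\Big(-\frac{\big(d_k(i)-y_k(i)\big)^2}{2\sigma^2}\Big).
\]
```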
“…Thus, when the combination coefficients are symmetric and convex, as in Equation 23, optimizing the global cost in terms of the individual costs (12) is equivalent to optimizing it in terms of the local costs (27). Hence, the global optimization problem can be expressed through the local optimization problems.…”
Section: Multitask Local Optimization
confidence: 99%
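The equivalence claimed in this statement follows from a short summation argument, sketched below with assumed notation (combination coefficients c_{l,k}, individual costs J_l); the cited equations (12), (23), and (27) belong to the citing paper and are not reproduced here.

```latex
% Sketch of the standard argument (notation assumed). If each node k forms a
% local cost as a convex combination of its neighbours' individual costs,
\[
  J_k^{\mathrm{loc}}(w) \;=\; \sum_{l \in \mathcal{N}_k} c_{l,k}\, J_l(w),
  \qquad c_{l,k} \ge 0,\quad \sum_{k} c_{l,k} = 1,
\]
% then summing over all nodes recovers the global cost,
\[
  \sum_{k=1}^{N} J_k^{\mathrm{loc}}(w)
  \;=\; \sum_{l=1}^{N} \Big(\sum_{k} c_{l,k}\Big) J_l(w)
  \;=\; \sum_{l=1}^{N} J_l(w) \;=\; J^{\mathrm{glob}}(w),
\]
% so minimising the global cost through the individual costs is equivalent to
% minimising it through the local costs.
```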
“…16,17 One useful way to extract similarities among objectives is to formulate optimization problems based on information-theoretic learning cost functions. [18][19][20][21][22][23][24][25][26][27][28][29][30] The maximum correntropy criterion (MCC) is one such measure of similarity that has recently been considered in learning problems. 25 In this paper, we investigate multitask learning over adaptive networks under situations that differ from those considered in earlier works.…”
Section: Introduction
confidence: 99%