The 2006 IEEE International Joint Conference on Neural Network Proceedings
DOI: 10.1109/ijcnn.2006.246906

Correntropy as a Novel Measure for Nonlinearity Tests

Cited by 20 publications (19 citation statements). References 11 publications.

“…As a nonlinear robust similarity measure, correntropy has been successfully applied in various areas of signal processing and machine learning, such as robust regression and filtering [1]-[6], robust classification [7], robust principal component analysis [8], spectral characterization [9], nonlinearity tests [10], period estimation [11], pitch detection in speech [12], and many others. In particular, the maximum correntropy criterion (MCC) has been shown to be a rather robust adaptation principle for adaptive system training in the presence of heavy-tailed non-Gaussian noise [1]-[6].…”
Section: Introduction (mentioning)
confidence: 99%
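
To make the MCC adaptation principle mentioned in this excerpt concrete, here is a minimal sketch of an LMS-style adaptive filter trained under MCC. The update scales each step by a Gaussian factor exp(-e^2 / (2*sigma^2)), which down-weights large, outlier-like errors; the signal model, step size, and kernel width below are illustrative assumptions, not values from the cited works.

import numpy as np

def mcc_lms(x, d, num_taps=4, mu=0.05, sigma=1.0):
    """Adaptive FIR filter trained under the maximum correntropy criterion (MCC).

    Behaves like LMS with each update scaled by exp(-e^2 / (2 * sigma^2)),
    so large (outlier-like) errors barely move the weights.
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-M+1]]
        e = d[n] - w @ u                      # prediction error
        g = np.exp(-e**2 / (2 * sigma**2))    # Gaussian weighting of the update
        w += mu * g * e * u                   # MCC stochastic-gradient step
    return w

# Toy usage: identify a short FIR channel under heavy-tailed (impulsive) noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.6, -0.3, 0.1, 0.05])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_t(df=1.5, size=len(x))
print(np.round(mcc_lms(x, d), 3))             # should be close to h
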
“…Correntropy, proposed in [27], is a kernel-based similarity measure which non-linearly maps the input space into a higher-dimensional feature space in which inner products are computed efficiently. The correntropy similarity measure includes the statistical distribution information and the time structure of signals in a single measure.…”
Section: Context-based Methods (mentioning)
confidence: 99%
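
As a brief illustration of the kernel-based measure described in this excerpt, the sketch below estimates correntropy between two samples with a Gaussian kernel, V(X, Y) ≈ (1/N) Σ_i exp(-(x_i - y_i)^2 / (2*sigma^2)). The kernel width sigma and the omission of the usual 1/(sqrt(2*pi)*sigma) normalisation are simplifying assumptions here.

import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy estimate with an (unnormalised) Gaussian kernel."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    diff = x - y
    return np.mean(np.exp(-diff**2 / (2.0 * sigma**2)))

# Identical signals reach the kernel maximum (1.0 here); unrelated noise scores lower.
rng = np.random.default_rng(1)
s = np.sin(np.linspace(0, 10 * np.pi, 1000))
print(correntropy(s, s))                          # 1.0
print(correntropy(s, rng.standard_normal(1000)))  # noticeably smaller
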
“…Following the literature, e.g. [10], in this paper we use a Gaussian kernel with Silverman's rule. As our calculation of the coh-entropy coefficient normalises both signals by mean and standard deviation before evaluation, Silverman's rule determines the kernel width to be 0.4.…”
Section: Coh-entropy Coefficient (mentioning)
confidence: 99%
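
For reference, one common form of Silverman's rule of thumb sets the kernel width to 0.9 * min(std, IQR/1.34) * N^(-1/5); whether the cited paper uses exactly this variant is an assumption, since the excerpt does not spell it out.

import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule-of-thumb kernel width for a 1-D sample."""
    x = np.asarray(x, dtype=float)
    std = x.std(ddof=1)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 0.9 * min(std, iqr / 1.34) * x.size ** (-0.2)

# For a standardised (zero-mean, unit-variance) signal the width depends mainly
# on the sample size, shrinking slowly as N**(-1/5).
rng = np.random.default_rng(2)
z = rng.standard_normal(500)
z = (z - z.mean()) / z.std()
print(round(silverman_bandwidth(z), 3))
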
“…In the non-stationary case, we followed the methodology in [3, 10, 58-60] by changing the coupling strength with time. In particular, we switched the coupling strength from 0 (no coupling) to 0.9 (tight coupling) at time 50 and back to 0 at time 150.…”
Section: Simulated Data - Hénon Maps (mentioning)
confidence: 99%
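
A minimal sketch of this kind of non-stationary simulation follows. It assumes the unidirectionally coupled Hénon maps commonly used in the coupling-detection literature (driver x, response y, both with a = 1.4, b = 0.3); the specific map equations, initial conditions, and burn-in are assumptions, not details taken from the cited paper.

import numpy as np

def coupled_henon(n_steps=200, switch_on=50, switch_off=150, c_on=0.9,
                  a=1.4, b=0.3, burn_in=100):
    """Unidirectionally coupled Hénon maps with a time-varying coupling c.

    Driver:   x[n+1] = a - x[n]**2 + b * x[n-1]
    Response: y[n+1] = a - (c * x[n] * y[n] + (1 - c) * y[n]**2) + b * y[n-1]
    The coupling c is 0 before switch_on, c_on between switch_on and
    switch_off, and 0 again afterwards.
    """
    total = n_steps + burn_in
    x = np.zeros(total)
    y = np.zeros(total)
    x[0], x[1] = 0.1, 0.2      # arbitrary initial conditions
    y[0], y[1] = 0.3, 0.4
    for n in range(1, total - 1):
        k = n - burn_in        # time index of the retained series
        c = c_on if switch_on <= k < switch_off else 0.0
        x[n + 1] = a - x[n]**2 + b * x[n - 1]
        y[n + 1] = a - (c * x[n] * y[n] + (1 - c) * y[n]**2) + b * y[n - 1]
    return x[burn_in:], y[burn_in:]

x, y = coupled_henon()
print(x[:3], y[:3])
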