2006
DOI: 10.1007/11679363_11

Estimating the Information Potential with the Fast Gauss Transform

Abstract: In this paper, we propose a fast and accurate approximation to the information potential of Information Theoretic Learning (ITL) using the Fast Gauss Transform (FGT). We exemplify here the case of the Minimum Error Entropy criterion to train adaptive systems. The FGT reduces the complexity of the estimation from O(N²) to O(pkN), where p is the order of the Hermite approximation and k the number of clusters utilized in the FGT. Further, we show that the FGT converges to the actual entropy value rapidly with …
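To make the abstract's quantities concrete, here is a minimal sketch of the direct O(N²) information potential estimator that the FGT is meant to accelerate. The function names, the NumPy implementation, and the kernel size in the example are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def information_potential_direct(samples, sigma):
    """Direct O(N^2) estimator of the information potential,
    V(X) = (1/N^2) * sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j),
    with G a Gaussian kernel; Renyi's quadratic entropy is then -log V."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    s = sigma * np.sqrt(2.0)                 # width after convolving two Parzen kernels
    diffs = x[:, None] - x[None, :]          # all N^2 pairwise differences
    kernel = np.exp(-diffs**2 / (2.0 * s**2)) / (s * np.sqrt(2.0 * np.pi))
    return kernel.sum() / n**2

# Example: quadratic Renyi entropy estimate of a synthetic error sample
errors = np.random.default_rng(0).normal(size=500)
H2 = -np.log(information_potential_direct(errors, sigma=0.5))
```

Minimizing this entropy over the adaptive system's parameters is the Minimum Error Entropy criterion mentioned in the abstract; the FGT replaces the quadratic double sum with a truncated Hermite expansion around k clusters, giving the stated O(pkN) cost.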

Cited by 10 publications (10 citation statements); references 8 publications; citing publications span 2007–2022.

Citation statements (ordered by relevance):
“…Therefore, we define the test statistic in (18), where the constant is given in (19). The resulting classification rule then follows. We name the classifier we have derived the Laplacian classifier, because of the connection to the Laplacian matrix and its eigenspectrum (see also Section V-A).…”
Section: New Classification Rule (mentioning)
confidence: 99%
“…Let the outlier and its weight be given. The test statistic (18) for the class, with the outlier appended, is then given by (23). The normalizing constant is the denominator of this expression.…”
Section: Robustness Against Outliers (mentioning)
confidence: 99%
“…Although there are double summations in the correntropy coefficient, the computational complexity can be reduced using the Fast Gauss Transform [35]. However, the "narrowed" autocorrelation function increases the computational complexity by including more delay terms.…”
Section: PDA Based on Correntropy (mentioning)
confidence: 99%
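For context, the double summations referred to here arise in the centered-correntropy terms of the correntropy coefficient. A rough sketch under that assumption follows; the estimator form, kernel size, and all names are my own choices, not drawn from the citing paper.

```python
import numpy as np

def gauss_kernel(d, sigma):
    return np.exp(-d**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

def centered_correntropy(x, y, sigma):
    """Sample estimate of centered correntropy,
    U(X,Y) = (1/N) sum_i G(x_i - y_i) - (1/N^2) sum_i sum_j G(x_i - y_j);
    the second term is the O(N^2) double summation a Fast Gauss Transform can accelerate."""
    joint = gauss_kernel(x - y, sigma).mean()
    cross = gauss_kernel(x[:, None] - y[None, :], sigma).mean()  # N^2 kernel evaluations
    return joint - cross

def correntropy_coefficient(x, y, sigma=1.0):
    """One common normalization: centered cross-correntropy divided by the
    geometric mean of the centered auto-correntropies."""
    uxy = centered_correntropy(x, y, sigma)
    uxx = centered_correntropy(x, x, sigma)
    uyy = centered_correntropy(y, y, sigma)
    return uxy / np.sqrt(uxx * uyy)

# Example (synthetic): coefficient between a signal and a noisy copy
rng = np.random.default_rng(1)
a = rng.normal(size=400)
b = a + 0.3 * rng.normal(size=400)
eta = correntropy_coefficient(a, b, sigma=1.0)
```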
“…Of course, the computation of each entry of the matrix, the correntropy between components, increases with the square of the number of samples. New fast O(N) techniques to compute each entry have been developed [3].…”
Section: Simulations (mentioning)
confidence: 99%
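The O(N) techniques referred to above are of the Fast-Gauss-Transform family described in this paper's abstract. Below is a simplified one-dimensional sketch of the underlying Hermite-expansion idea; the uniform clustering, the truncation order p, and all names are my own simplifications for illustration, not any reference implementation.

```python
import math
import numpy as np
from scipy.special import eval_hermite   # physicists' Hermite polynomials H_n

def gauss_sum_fgt_1d(sources, targets, sigma, p=10):
    """Approximate g(y) = sum_j exp(-(y - x_j)^2 / (2*sigma^2)) for every target y
    with a truncated Hermite expansion about cluster centres.
    Cost is O(p*k*N) rather than the O(N^2) of the direct double sum."""
    delta = 2.0 * sigma**2                               # FGT bandwidth convention
    lo, hi = sources.min(), sources.max()
    k = max(1, int(np.ceil((hi - lo) / sigma)))          # crude uniform clustering
    edges = np.linspace(lo, hi, k + 1)
    centres = 0.5 * (edges[:-1] + edges[1:])
    labels = np.clip(np.digitize(sources, edges) - 1, 0, k - 1)

    g = np.zeros_like(targets, dtype=float)
    for c, centre in enumerate(centres):
        xs = sources[labels == c]
        if xs.size == 0:
            continue
        s = (xs - centre) / np.sqrt(delta)
        # cluster moments C_n = (1/n!) * sum_j s_j^n, accumulated once per cluster
        C = np.array([(s**n).sum() / math.factorial(n) for n in range(p)])
        t = (targets - centre) / np.sqrt(delta)
        # Hermite functions h_n(t) = exp(-t^2) * H_n(t)
        H = np.stack([eval_hermite(n, t) for n in range(p)], axis=1)
        g += (np.exp(-t**2)[:, None] * H) @ C
    return g

# Quick check against the direct O(N^2) sum on synthetic data
rng = np.random.default_rng(2)
x = rng.normal(size=1000)
approx = gauss_sum_fgt_1d(x, x, sigma=0.5, p=10)
direct = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.5**2)).sum(axis=1)
```

Up to the kernel's 1/(σ√(2π)) normalization, the same expansion can stand in for the double sum in the information potential sketch given after the abstract above.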