2007
DOI: 10.1016/j.csda.2007.03.020

Kernel ellipsoidal trimming

Abstract: Ellipsoid estimation is an issue of primary importance in many practical areas such as control, system identification, visual/audio tracking, experimental design, data mining, robust statistics and novelty/outlier detection. This paper presents a new method of kernel information matrix ellipsoid estimation (KIMEE) that finds an ellipsoid in a kernel defined feature space based on a centered information matrix. Although the method is very general and can be applied to many of the aforementioned problems, the ma…

Cited by 17 publications (14 citation statements) · References 30 publications
“…All possible correlations were computed by testing the line that fits best the maximum number of participants following the concepts of Dolia et al [28]. We simplified the latter method by substituting ellipsoidal kernels by lines.…”
Section: Correlation Analysis and Statistics (confidence: 99%)
“…Then we can write $\sum_i \alpha_i x_i x_i^\top + \gamma I = X^\top A^2 X + \gamma I$. Note that the matrices $(AX)^\top (AX) = X^\top A^2 X$ and $(AX)(AX)^\top = A X X^\top A = A K A$ have the same nonzero eigenvalues $\lambda_i$, equal to the squares of the singular values of $AX$ [2]. With $d$ the dimensionality of the space and the number of data points $x_i$, it is now easy to show that:…”
Section: Kernel Regularised Minimum Volume Covering Ellipsoid (confidence: 99%)
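The eigenvalue identity quoted above can be checked numerically. The following is a minimal NumPy sketch (not from the cited paper) assuming a linear kernel $K = XX^\top$ and a diagonal matrix $A$ with $A^2 = \mathrm{diag}(\alpha)$: the $d \times d$ matrix $X^\top A^2 X$ and the $n \times n$ matrix $AKA$ share their nonzero eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3                        # n data points in d dimensions (illustrative sizes)
X = rng.standard_normal((n, d))    # rows are the data points x_i
alpha = rng.random(n) + 0.1        # positive weights alpha_i
A = np.diag(np.sqrt(alpha))        # so that A^2 = diag(alpha)
K = X @ X.T                        # linear-kernel Gram matrix

# (AX)^T (AX) = X^T A^2 X  (d x d)  and  (AX)(AX)^T = A K A  (n x n)
M_small = X.T @ np.diag(alpha) @ X
M_large = A @ K @ A

ev_small = np.sort(np.linalg.eigvalsh(M_small))[::-1]
ev_large = np.sort(np.linalg.eigvalsh(M_large))[::-1]

# The d nonzero eigenvalues coincide; the remaining n - d are numerically zero.
assert np.allclose(ev_small[:d], ev_large[:d])
assert np.all(np.abs(ev_large[d:]) < 1e-9)
```

The same equivalence is what lets kernel methods work with the $n \times n$ matrix $AKA$ when the feature-space dimensionality is large or infinite.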
“…We should be able to compute the Mahalanobis distance for a test point exclusively using kernel evaluations and the vector $\alpha$. Recall the eigenvalue decompositions of $\sum_i \alpha_i x_i x_i^\top = X^\top A^2 X = U \Lambda U^\top$ and $A X X^\top A = A K A = V \Lambda V^\top$ [2]. We then have that…”
Section: Kernel Regularised Minimum Volume Covering Ellipsoid (confidence: 99%)
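To illustrate the claim that the Mahalanobis distance can be computed from kernel evaluations alone, here is a hedged NumPy sketch (my own reconstruction, not the cited paper's code) for the regularised matrix $M = X^\top A^2 X + \gamma I$, assuming a linear kernel so that $k(x_i, z) = x_i^\top z$. The kernel-only route uses the Woodbury identity, which rewrites $M^{-1}$ in terms of the $n \times n$ matrix $\gamma I + AKA$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, gamma = 6, 3, 0.5
X = rng.standard_normal((n, d))    # rows are the data points x_i
alpha = rng.random(n) + 0.1
A = np.diag(np.sqrt(alpha))        # A^2 = diag(alpha)
K = X @ X.T                        # Gram matrix of kernel evaluations k(x_i, x_j)
z = rng.standard_normal(d)         # test point

# Direct route in input space: d^2 = z^T (X^T A^2 X + gamma I)^{-1} z
M = X.T @ np.diag(alpha) @ X + gamma * np.eye(d)
d2_direct = z @ np.linalg.solve(M, z)

# Kernel-only route via the Woodbury identity:
# (X^T A^2 X + gamma I)^{-1} = (1/gamma) (I - X^T A (gamma I + A K A)^{-1} A X)
k_z = X @ z                        # vector of kernel evaluations k(x_i, z)
inner = np.linalg.solve(gamma * np.eye(n) + A @ K @ A, A @ k_z)
d2_kernel = (z @ z - k_z @ A @ inner) / gamma   # z @ z is k(z, z) for a linear kernel

assert np.isclose(d2_direct, d2_kernel)
```

Only $k(z, z)$, the vector $k(x_i, z)$, and the Gram matrix $K$ appear in the kernel-only expression, so the same computation carries over to any positive-definite kernel.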
“…Pauwels and Ambekar [29] reformulate the cost function for the one-class SVM (OCSVM) so that the centre of the sphere is a weighted median of the support vectors, rather than the weighted mean of the support vectors. Dolia et al [30] use kernel ellipsoidal trimming where the outliers are removed from the training set and the algorithm rerun. Both OCSVM and kernel ellipsoidal trimming use the boundary for anomaly detection.…”
Section: Minimum Volume Elliptical PCA (confidence: 99%)