2009 IEEE International Workshop on Machine Learning for Signal Processing
DOI: 10.1109/mlsp.2009.5306191
Input space regularization stabilizes pre-images for kernel PCA de-noising

Abstract: Solution of the pre-image problem is key to efficient nonlinear de-noising using kernel Principal Component Analysis. Pre-image estimation is inherently ill-posed for typical kernels used in applications and consequently the most widely used estimation schemes lack stability. For de-noising applications we propose input space distance regularization as a stabilizer for pre-image estimation. We perform extensive experiments on the USPS digit modeling problem to evaluate the stability of three widely used pre-im…
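The abstract's idea can be sketched in code. The following is a minimal illustration, not the paper's implementation: it uses an uncentered Gaussian-kernel PCA and a Mika-style fixed-point pre-image iteration, with a quadratic input-space penalty lam * ||z - x||^2 added to the feature-space objective as the abstract describes. All names (`denoise_preimage`, `lam`, `q`) and the specific update rule are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, sigma):
    """Gaussian kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def denoise_preimage(x, X, sigma=1.0, q=4, lam=0.1, iters=200):
    """De-noise x by projecting phi(x) onto q kernel PCA components,
    then estimating the pre-image z with an added input-space penalty
    lam * ||z - x||^2.  (Feature-space centering is omitted for brevity.)"""
    K = rbf(X, X, sigma)
    w, V = np.linalg.eigh(K)                 # eigenvalues in ascending order
    top = np.argsort(w)[::-1][:q]
    alpha = V[:, top] / np.sqrt(w[top])      # normalized expansion coefficients
    kx = rbf(X, x[None, :], sigma).ravel()
    gamma = alpha @ (alpha.T @ kx)           # weights of P phi(x) in the span
    z = x.copy()                             # start from the noisy observation
    for _ in range(iters):
        kz = rbf(X, z[None, :], sigma).ravel()
        wgt = gamma * kz
        # Regularized fixed-point step: the lam * sigma^2 terms pull z toward x
        # and keep the denominator bounded away from zero.
        z = (wgt @ X + lam * sigma**2 * x) / (wgt.sum() + lam * sigma**2)
    return z
```

With lam = 0 this reduces to the standard fixed-point scheme, whose denominator can vanish and make iterates unstable; the penalty keeps the iterate anchored near the noisy observation, which is the stabilizing effect the abstract argues for.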

Cited by 15 publications (12 citation statements). References 13 publications.
“…Since the map Γ in (16) has no inverse, finding Υ* in X from τ* defines an ill-posed problem [70,71,76–78]. In this setting, the determination of Υ* from τ* requires the use of a projector P onto τ₀ (20) in H, which yields an element P(τ₀) in H. If τ* lies in (or close to) the span of {Γ(Υ_i)}, where Υ_i is the i-th training datum, Υ_i ∈ X, from a training set S_X of N training data,…”
Section: Connectivity of the Objective Function in the Target Statement
confidence: 99%
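The ill-posedness noted in the excerpt above can be seen numerically: with an RBF kernel, two very different inputs far from the training data project onto (essentially) the same element of the training span, so the projection has no unique pre-image. A minimal sketch, assuming an uncentered Gaussian-kernel PCA basis; all names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))            # training set in input space
sigma = 1.0
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / (2 * sigma**2))        # RBF kernel matrix
w, V = np.linalg.eigh(K)
alpha = V[:, -3:] / np.sqrt(w[-3:])     # top-3 kernel PCA directions

def coords(z):
    """Coordinates of the projection of phi(z) onto the kernel PCA basis."""
    kz = np.exp(-((X - z) ** 2).sum(-1) / (2 * sigma**2))
    return alpha.T @ kz

a = np.array([50.0, 50.0])              # two very different points,
b = np.array([-80.0, 30.0])             # both far from the data
gap = np.linalg.norm(coords(a) - coords(b))
print(gap)                              # essentially 0: identical projections
```

Because the Gaussian kernel decays to zero away from the data, both points map to (numerically) the zero projection, so inverting the projection cannot distinguish them; this is the degenerate case that motivates regularizing the pre-image estimate.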
“…We see that in kernel MAF analysis, where we find only one linear combination, we need not regularize. Still, we may wish to regularize, and if so one of several possible alternative versions of the primal formulation is (20), which in the dual version becomes (21), which in turn kernelizes to (22)…”
Section: Regularization and Kernelization
confidence: 99%
“…For the kernel MAF/MNF variates to be used in, for example, denoising, we must look into the so-called pre-image problem [20]–[22]. This deals with the complicated problem of mapping back from the feature space defined implicitly by the kernel function to the original variable space.…”
confidence: 99%
“…This helps to reduce the computational complexity of forming the decision boundary while maintaining high classification accuracy. Schölkopf et al. [7–11] propose kernel-based principal component analysis (PCA) to build pre-images in feature space, which improves the performance of image denoising. Lillholm and Nielsen discuss the relation between an image and its features [12,13], define metamery classes of images and the corresponding information content of a canonical least-informative representative of the class.…”
Section: Introduction
confidence: 99%