1998
DOI: 10.1016/s0893-6080(98)00032-x

The connection between regularization operators and support vector kernels

Abstract: In this paper a correspondence is derived between regularization operators used in regularization networks and support vector kernels. We prove that the Green's functions associated with regularization operators are suitable support vector kernels with equivalent regularization properties. Moreover, the paper provides an analysis of currently used support vector kernels in view of regularization theory and corresponding operators associated with the classes of both polynomial kernels and translation invari…
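The correspondence claimed in the abstract can be stated compactly. The following is a sketch in standard notation (the symbols are assumed, since the abstract itself is truncated): a regularization network minimizes a regularized risk built from an operator P, and a kernel that is the Green's function of P*P reproduces exactly that penalty.

```latex
% Regularized risk with regularization operator P:
R_{\mathrm{reg}}[f] \;=\; R_{\mathrm{emp}}[f] \;+\; \frac{\lambda}{2}\,\lVert P f\rVert^{2}.
% If k is the Green's function of P^{*}P, i.e.
(P^{*}P\,k)(x,\cdot) \;=\; \delta_{x}(\cdot),
% then the kernel reproduces its own regularized inner product,
\langle P\,k(x,\cdot),\; P\,k(x',\cdot)\rangle \;=\; k(x,x'),
% so k is an admissible support vector kernel whose SV expansion
f(x) \;=\; \sum_{i} \beta_{i}\, k(x_{i}, x)
% minimizes the same regularized risk as the network built from P.
```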


Cited by 564 publications (340 citation statements). References 27 publications (28 reference statements).
“…Recently, Support Vector Machines (SVMs) have been introduced by Vapnik (Boser, Guyon, & Vapnik, 1992; Vapnik, 1995) for solving classification and nonlinear function estimation problems (Cristianini & Shawe-Taylor, 2000; Schölkopf et al., 1997; Schölkopf, Burges, & Smola, 1998; Smola, Schölkopf, & Müller, 1998; Smola, 1999; Suykens & Vandewalle, 1998; Vapnik, 1995, 1998a, 1998b). Within this new approach the training problem is reformulated so as to obtain a (convex) quadratic programming (QP) problem.…”
Section: Introduction
confidence: 99%
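The "(convex) QP" mentioned in the statement above is the SVM dual. A minimal sketch of that dual, on a hypothetical two-point toy problem (not taken from the cited paper), shows how the constrained quadratic objective is maximized numerically:

```python
import numpy as np

# Hypothetical toy illustration of the SVM dual QP:
#   max_alpha  sum_i alpha_i - 1/2 sum_ij alpha_i alpha_j y_i y_j K_ij
#   s.t.       alpha_i >= 0,  sum_i alpha_i y_i = 0
# for two points x1 = (1, 0), y1 = +1 and x2 = (-1, 0), y2 = -1,
# with a linear kernel.
X = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, -1.0])
K = X @ X.T  # Gram matrix of the linear kernel

# The equality constraint forces alpha_1 = alpha_2 = a, so the dual
# collapses to the concave 1-D objective W(a) = 2a - 2a^2.
a = 0.0
lr = 0.1
for _ in range(200):
    grad = 2.0 - 4.0 * a         # dW/da
    a = max(0.0, a + lr * grad)  # projected gradient ascent (a >= 0)

alpha = np.array([a, a])
w = (alpha * y) @ X              # primal weights: w = sum_i alpha_i y_i x_i
print(a, w)                      # converges to a = 0.5, w = [1, 0]
```

Because the objective is concave and the feasible set convex, the projected ascent converges to the unique maximizer; the resulting separating hyperplane w·x = 0 is the maximum-margin solution for this toy set.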
“…Recently, Support Vector Machines (SVMs) have been introduced by Vapnik (Boser, Guyon, & Vapnik, 1992;Vapnik, 1995) for solving classification and nonlinear function estimation problems (Cristianini & Shawe-Taylor, 2000;Schölkopf et al, 1997;Schölkopf, Burges, & Smola, 1998;Smola, Schölkopf, & Müller, 1998;Smola, 1999;Suykens & Vandewalle, 1998;Vapnik, 1995Vapnik, , 1998aVapnik, , 1998b. Within this new approach the training problem is reformulated and represented in such a way so as to obtain a (convex) quadratic programming (QP) problem.…”
Section: Introductionmentioning
confidence: 99%
“…This is due to the fact that the capacity control performed by the SVM method is equivalent to some form of regularisation, so that "denoising" is not necessary [37]. In the case of KTA, the optimisation performed recognises directly the variables that do not report information about the labelling or that are very noisy.…”
Section: Filtering Non-informative Features For the Construction Of T…
confidence: 99%
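The equivalence invoked in the statement above (capacity control as "some form of regularisation") is usually made explicit by rewriting the soft-margin SVM primal as a regularized risk. A sketch in standard notation (symbols assumed, not quoted from the statement):

```latex
% Soft-margin SVM primal:
\min_{w,\,b,\,\xi}\;\; \frac{1}{2}\lVert w\rVert^{2} + C\sum_{i}\xi_{i}
\quad \text{s.t.} \quad y_{i}\bigl(\langle w, x_{i}\rangle + b\bigr) \ge 1 - \xi_{i},
\;\; \xi_{i} \ge 0.
% Dividing by C, this is a regularized empirical risk
\frac{1}{2C}\,\lVert w\rVert^{2} \;+\; \sum_{i}\xi_{i},
% with \lVert w\rVert^{2} playing the role of the smoothness penalty
% \lVert P f\rVert^{2} of a regularization network, and 1/(2C) the role
% of the regularization parameter \lambda.
```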
“…Let the Mercer kernel be defined by k : X × X → R, and the regularisation operator Γ (Smola, Schölkopf, and Müller, 1998). Consider a kernel endowed with translation invariance, namely k(x, ξ) …”
Section: Model Construction
confidence: 99%
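For the translation-invariant case mentioned in the statement above, the paper's criterion is usually stated in Fourier terms. A sketch (notation assumed): a kernel k(x, ξ) = k(x − ξ) is an admissible SV kernel when its Fourier transform is nonnegative, and the associated operator penalizes each frequency in inverse proportion to that transform.

```latex
% Translation-invariant kernel: k(x, \xi) = k(x - \xi).
% Admissibility criterion (Bochner): \tilde{k}(\omega) \ge 0 for all \omega.
% The corresponding regularization operator P satisfies
\lVert P f\rVert^{2} \;=\; \int \frac{\lvert \tilde{f}(\omega)\rvert^{2}}{\tilde{k}(\omega)}\, d\omega,
% so frequencies where \tilde{k}(\omega) is small are penalized strongly:
% the decay of \tilde{k} encodes the smoothness assumptions of the kernel.
```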