2015
DOI: 10.1007/s10994-015-5540-x
Tikhonov, Ivanov and Morozov regularization for support vector machine learning

Abstract: Learning according to the structural risk minimization principle can be naturally expressed as an Ivanov regularization problem. Vapnik himself pointed out this connection when deriving an actual learning algorithm from this principle, namely the well-known support vector machine, but quickly suggested resorting to a Tikhonov regularization scheme instead. At that time this was the best choice, because the corresponding optimization problem is easier to solve and, in any case, under certain hypotheses, the sol…
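
For orientation, here is a generic sketch of the three regularization schemes named in the title (our notation, not necessarily the paper's), writing \hat{R}(w) for the empirical risk and \lVert w\rVert^2 for the regularizer:

\[
\begin{aligned}
\text{Tikhonov:} \quad & \min_{w}\ \hat{R}(w) + \lambda\,\lVert w\rVert^{2} \\
\text{Ivanov:} \quad & \min_{w}\ \hat{R}(w) \quad \text{subject to}\ \lVert w\rVert^{2} \le r^{2} \\
\text{Morozov:} \quad & \min_{w}\ \lVert w\rVert^{2} \quad \text{subject to}\ \hat{R}(w) \le \varepsilon
\end{aligned}
\]

Under suitable convexity hypotheses the three problems admit matching solutions for corresponding values of \lambda, r, and \varepsilon, which is presumably the equivalence the truncated abstract alludes to.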

Cited by 54 publications (28 citation statements). References 52 publications.
“…The learning process and the model selection phase are two important aspects of the SVM algorithm. Reference [21], "Tikhonov, Ivanov and Morozov regularization for Support Vector Machine Learning", introduces methods for training support vector machines under structural risk minimization and compares the advantages and disadvantages of the three kinds of regularization algorithms; building on the results of [21], this paper chooses an appropriate regularization algorithm to carry out the SVM learning process, balancing the algorithm's effectiveness and practicality. Reference [22], "In-Sample and Out-of-Sample Model Selection and Error Estimation for Support Vector Machines", details common methods for the SVM model selection phase and explains the difference between in-sample and out-of-sample approaches and the conditions under which each applies.…”
Section: The Basic Idea of Support Vector Machine
Confidence: 99%
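
As a concrete illustration of the two phases this quote distinguishes, here is a minimal Python sketch (ours, not taken from [21] or [22]): the learning process is a Tikhonov-regularized soft-margin SVM, where C is the regularization trade-off, and out-of-sample model selection chooses C by cross-validation. The dataset is synthetic and purely illustrative.

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic binary classification data, for illustration only.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Learning process: SVC solves the Tikhonov-style soft-margin primal
#   min_{w,b}  1/2 ||w||^2 + C * sum_i hinge(y_i, <w, x_i> + b).
# Model selection: 5-fold cross-validation picks C out-of-sample.
grid = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10, 100]}, cv=5)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)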
“…Radical type (71.3%): 46 (radical type), 11
Common type (70.9%): 61 (common type), 21
Radical type (71.1%): 43 (radical type), 39…”
Section: Model Verification
Confidence: 99%
“…where A : ℝ^{m×n} → ℝ^d is a linear functional taking unknown parameters x ∈ ℝ^{m×n} to observations b ∈ ℝ^d. Problem (1) is also known as a Morozov formulation (in contrast to an Ivanov or Tikhonov one [17]). The functional A can include a transformation to another domain, such as Wavelet, Fourier, or Curvelet coefficients [7], as well as compositions of these transforms with other linear operators such as restriction in interpolation problems.…”
Section: Introduction
Confidence: 99%
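
To make the contrast concrete, here is a small Python sketch (our construction, not the cited paper's algorithm) of a Morozov-style formulation min ||x||_2 subject to ||Ax − b||_2 ≤ σ. It uses the standard fact that, when the constraint is active, the solution coincides with a Tikhonov/ridge solution whose parameter is tuned by the discrepancy principle, i.e., so that the residual equals σ; the test problem and tolerances are invented for illustration.

import numpy as np

def morozov_solve(A, b, sigma, lam_lo=1e-10, lam_hi=1e10, iters=100):
    """Solve min ||x||_2 s.t. ||Ax - b||_2 <= sigma via ridge + discrepancy."""
    n = A.shape[1]

    def x_of(lam):
        # Tikhonov/ridge solution for regularization weight lam.
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    # The residual ||A x(lam) - b|| grows monotonically with lam, so a
    # geometric bisection finds the lam whose residual matches sigma.
    for _ in range(iters):
        lam = np.sqrt(lam_lo * lam_hi)
        if np.linalg.norm(A @ x_of(lam) - b) < sigma:
            lam_lo = lam  # residual too small: regularize harder
        else:
            lam_hi = lam
    return x_of(np.sqrt(lam_lo * lam_hi))

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
b = A @ rng.normal(size=20) + 0.1 * rng.normal(size=50)
x_hat = morozov_solve(A, b, sigma=0.1 * np.sqrt(50))
print(np.linalg.norm(A @ x_hat - b))  # ~ sigma, by construction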
“…In order to handle this kind of problem, the application of Support Vector Machines (SVMs) is a popular choice in the machine-learning research area, since these are learning machines that implement the structural-risk-minimization inductive principle to obtain good generalization from a limited number of learning patterns [25,26,22]. The theory of SVMs was developed on the basis of a separable binary classification problem, where the optimization criterion is the width of the margin, measured in the ℓ2-norm, between the positive and negative examples.…”
Section: Introduction
Confidence: 99%
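
For reference, the separable maximum-margin problem described in this quote is the standard hard-margin SVM primal (textbook notation, not quoted from the citing paper):

\[
\min_{w,\,b}\ \tfrac{1}{2}\,\lVert w\rVert_{2}^{2}
\quad \text{subject to} \quad
y_{i}\left(\langle w, x_{i}\rangle + b\right) \ge 1,\qquad i = 1,\dots,n,
\]

whose optimum maximizes the geometric margin 2/\lVert w\rVert_{2} between the two classes; relaxing the constraints with hinge-loss slack and a trade-off constant C yields the soft-margin, Tikhonov-regularized SVM discussed above.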