2009 International Joint Conference on Neural Networks
DOI: 10.1109/ijcnn.2009.5178701
The Multiple Pairs SMO: A modified SMO algorithm for the acceleration of the SVM training

Abstract: The Sequential Minimal Optimization (SMO) algorithm is one of the most efficient solvers for the Support Vector Machine training phase. It solves a quadratic programming (QP) problem by optimizing a set of coefficients whose size equals the number of training examples. However, its execution time can be long due to its computational complexity: the algorithm performs many calculations per iteration, and many iterations are needed before a stopping criterion is satisfied. Due to its importance, many impro…
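The per-iteration structure the abstract describes — repeatedly solving a two-variable QP analytically over a pair of multipliers — can be sketched as follows. This is a simplified SMO variant with random second-multiplier selection (in the spirit of Platt's original algorithm), not the paper's multiple-pairs acceleration; all function and parameter names are illustrative.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-4, max_passes=20, seed=0):
    """Simplified SMO for a linear-kernel SVM (illustrative sketch).

    At each step, one pair (alpha_i, alpha_j) is optimized analytically
    while all other multipliers are held fixed; this is the pairwise
    update that the paper's multiple-pairs variant accelerates.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                      # linear-kernel Gram matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            # Prediction error on example i: E_i = f(x_i) - y_i
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # Only examples violating the KKT conditions are updated
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                j = j + 1 if j >= i else j        # pick a second index j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box constraints on alpha_j from 0 <= alpha <= C
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                eta = 2 * K[i, j] - K[i, i] - K[j, j]  # second derivative (<= 0)
                if L == H or eta >= 0:
                    continue
                # Analytic solution of the two-variable QP, clipped to [L, H]
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                # alpha_i moves to keep the equality constraint satisfied
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update the threshold b from whichever multiplier is unbound
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X              # primal weight vector (linear kernel only)
    return w, b, alpha
```

The outer loop illustrates why SMO's runtime adds up: each pass recomputes errors and performs many cheap pairwise updates, and many passes may be needed before the KKT-based stopping criterion holds, which is the cost the cited acceleration techniques target.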

Cited by 6 publications (2 citation statements)
References 10 publications
“…To improve convergence, more than two variables are optimized at a time [10], [11], [12]. In [12] SMO-NM was proposed, in which SMO and Newton's method are fused.…”
Section: Introduction (mentioning)
confidence: 99%
“…To improve convergence, more than two variables are optimized at a time [22][23][24][25][26][27]. In [22], q (≥ 2) modifiable variables are selected in the steepest ascent direction and are optimized.…”
mentioning
confidence: 99%