2005
DOI: 10.1109/tnn.2004.841785

An Improved Conjugate Gradient Scheme to the Solution of Least Squares SVM

Abstract: The least squares support vector machine (LS-SVM) formulation corresponds to the solution of a linear system of equations. Several approaches to its numerical solution have been proposed in the literature. In this letter, we propose an improved method for the numerical solution of LS-SVM and show that the problem can be solved using one reduced system of linear equations. Compared with the existing algorithm for LS-SVM, the approach used in this letter is about twice as efficient. Numerical results using the p…
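To make the linear-system character of LS-SVM concrete, below is a minimal sketch of the classical conjugate-gradient scheme that the letter improves upon: the bias b is eliminated and two symmetric positive-definite systems are solved with CG, whereas the letter's contribution is to collapse these into a single reduced system. The RBF kernel choice and the names rbf_kernel, lssvm_train, gamma, and sigma are illustrative assumptions, not the paper's notation.

```python
# A minimal sketch (not the letter's exact reduced scheme) of solving the
# LS-SVM KKT system  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
# with conjugate gradient, following the classical two-system formulation.
import numpy as np
from scipy.sparse.linalg import cg

def rbf_kernel(X, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Classical LS-SVM solve via block elimination of the bias b."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, sigma)
    A = Omega + np.eye(n) / gamma            # symmetric positive definite
    # Eliminating b gives alpha = A^{-1}(1 - b*y); run CG on the two SPD
    # systems A @ eta = 1 and A @ nu = y (the letter reduces these to one).
    eta, _ = cg(A, np.ones(n))
    nu, _ = cg(A, y.astype(float))
    b = (y @ eta) / (y @ nu)                 # enforces y^T alpha = 0
    alpha = eta - b * nu
    return alpha, b
```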

Cited by 84 publications (44 citation statements)
References 6 publications
“…LS-SVM greatly simplifies the problem by characterizing the solution as a linear system. For large-scale problems, linear system (5) can be solved efficiently using the CG method [5], [6]. A drawback of LS-SVM is that sparseness is lost in its solution: from the KKT conditions (4), every data point contributes to the model, and the relative importance of a data point is reflected by its support value.…”
Section: LS-SVM Classifiers
confidence: 99%
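The lost-sparseness point in the statement above can be checked directly: in a generic LS-SVM solution no support value is exactly zero, so no training point can be discarded for free. The snippet below is a hypothetical demo that assumes the lssvm_train sketch shown earlier is in scope; the data and seed are made up for illustration.

```python
# Hypothetical demo of the non-sparseness remark: solve a small LS-SVM
# problem and count exactly-zero support values (generically there are none).
# Assumes lssvm_train from the earlier sketch is in scope.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

alpha, b = lssvm_train(X, y, gamma=10.0, sigma=1.0)
print("exactly-zero support values:", int(np.sum(alpha == 0.0)))  # typically 0
```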
“…The radial basis function is used as the kernel function. For each dataset, we use the parameters suggested in [6] and [7]. We will compare the proposed pruning methods with CG methods in terms of both computational cost and classification accuracy.…”
Section: Numerical Experiments
confidence: 99%
“…Recently, Vapnik's SVM theory has been applied successfully to classification and regression problems [8][9][10]. SVM solves a constrained quadratic optimization problem, and it is based on statistical learning theory, which makes it possible to control the model's complexity and hence its generalization ability.…”
Section: Introduction
confidence: 99%
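For readers unfamiliar with the constrained quadratic program mentioned above, here is a hedged sketch of the soft-margin SVM dual, solved with a generic SLSQP routine purely for illustration; the name svm_dual, the precomputed-kernel argument, and the parameter C are assumptions, not anything from the cited papers.

```python
# Sketch of the soft-margin SVM dual:
#   max  1^T a - 0.5 a^T Q a   s.t.  y^T a = 0,  0 <= a_i <= C,
# where Q_ij = y_i y_j K(x_i, x_j); solved with a generic SLSQP solver.
import numpy as np
from scipy.optimize import minimize

def svm_dual(K, y, C=1.0):
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K
    obj = lambda a: 0.5 * a @ Q @ a - a.sum()      # negated dual objective
    grad = lambda a: Q @ a - np.ones(n)
    cons = {"type": "eq", "fun": lambda a: a @ y, "jac": lambda a: y}
    res = minimize(obj, np.zeros(n), jac=grad, bounds=[(0.0, C)] * n,
                   constraints=[cons], method="SLSQP")
    return res.x                                   # dual variables alpha
```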
“…Methods that use second-order derivatives of the error function include the conjugate gradient algorithm [7,8], Newton's method and the Newton-like Levenberg-Marquardt (LM) method [9,10], and quasi-Newton methods [10,11]. Rather than computing the Hessian matrix of second derivatives directly, all of these methods approximate it and apply the approximation during training, which gives them fast learning speeds.…”
unclassified
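As a concrete instance of the approximate-Hessian idea in the statement above, here is a minimal sketch of a single Levenberg-Marquardt step, where the Gauss-Newton term J^T J stands in for the exact Hessian of a least-squares error. The function names and the damping parameter mu are illustrative assumptions.

```python
# One Levenberg-Marquardt update for minimizing 0.5 * ||r(params)||^2:
# the Hessian is approximated by J^T J (no second derivatives computed),
# damped by mu to keep the step well-conditioned.
import numpy as np

def lm_step(residual, jacobian, params, mu=1e-2):
    r = residual(params)                 # residual vector r(w)
    J = jacobian(params)                 # Jacobian dr/dw
    H = J.T @ J                          # Gauss-Newton Hessian approximation
    g = J.T @ r                          # gradient of the squared error
    step = np.linalg.solve(H + mu * np.eye(H.shape[0]), g)
    return params - step
```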