Year: 2004
DOI: 10.1023/b:mach.0000008082.80494.e0

Benchmarking Least Squares Support Vector Machine Classifiers

Abstract: In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets …
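
The core contrast in the abstract, a linear system in place of a QP, can be made concrete. Below is a minimal sketch of the standard LS-SVM dual system (not the authors' code; the RBF kernel choice, parameter names, and helper functions are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """RBF kernel matrix: K(x, z) = exp(-||x - z||^2 / sigma^2)."""
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / sigma ** 2)

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Solve the LS-SVM dual, which is a linear system rather than a QP:

        [ 0      y^T             ] [ b     ]   [ 0 ]
        [ y      Omega + I/gamma ] [ alpha ] = [ 1 ]

    with Omega_ij = y_i * y_j * K(x_i, x_j) and labels y_i in {-1, +1}.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = y.size
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]            # bias b, dual weights alpha

def lssvm_predict(X_train, y_train, b, alpha, X_new, sigma=1.0):
    """Classify with y(x) = sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    K = rbf_kernel(np.asarray(X_new, dtype=float),
                   np.asarray(X_train, dtype=float), sigma)
    return np.sign(K @ (alpha * np.asarray(y_train, dtype=float)) + b)
```

With labels in {-1, +1}, the regularisation constant gamma and the kernel bandwidth sigma are the only hyperparameters, and training reduces to a single call to np.linalg.solve; that is the simplification the abstract highlights relative to the SVM's QP.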

Cited by 655 publications (359 citation statements)
References 54 publications
“…a dispatcher) who uses a combination of ticket description and knowledge of component dependencies to identify potential faulty components. Supervised learning techniques such as Support Vector Machine (SVM) [15] may also be used to suggest for new tickets the likelihood (represented by a probability distribution) of each component being the source of the problem [2]. A combination involving a human dispatcher being assisted by an automated agent is also possible.…”
Section: System Model (mentioning)
confidence: 99%
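
The excerpt's idea of an SVM that returns a probability distribution over candidate faulty components can be sketched with off-the-shelf tools; the toy ticket texts, component names, and the use of Platt scaling below are illustrative assumptions, not details from the cited work:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import SVC

# Toy tickets and the component later found to be at fault (purely illustrative).
tickets = [
    "db connection timeout on checkout", "replica lag on reports db",
    "query deadlock in orders db",
    "login page returns 500", "password reset email never arrives",
    "session expires immediately after login",
    "payment gateway declined all cards", "refund stuck in pending state",
    "card tokenization fails intermittently",
]
components = ["database"] * 3 + ["auth-service"] * 3 + ["payments"] * 3

vec = TfidfVectorizer()
X = vec.fit_transform(tickets)

# probability=True makes the SVM emit a probability distribution over
# components (via Platt scaling) instead of a single hard label.
clf = SVC(kernel="linear", probability=True).fit(X, components)

probs = clf.predict_proba(vec.transform(["checkout keeps timing out"]))[0]
for component, p in sorted(zip(clf.classes_, probs), key=lambda t: -t[1]):
    print(f"{component}: {p:.2f}")
```

A dispatcher could then review the ranked components rather than a single hard prediction, which matches the human-plus-automated-agent combination the excerpt describes.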
“…Since these results are well-known in the literature, we give here a small review: multiclass schemes such as OvO, OvA, and DAG, have essentially the same accuracy as single-machine schemes. The LSM performs just as well as SVM, as illustrated in many studies [3,19] including [20] with binary and multiclass classification. Therefore, a simple scheme such as OvA (for SVM or LSM) is preferable to a more complicated single-machine or error-correcting coding scheme.…”
Section: Introduction (mentioning)
confidence: 82%
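
The one-vs-all (OvA) scheme this excerpt recommends amounts to a small wrapper around any binary machine. Here is a hedged sketch using scikit-learn's SVC as the binary learner; the function names are illustrative, and an LS-SVM could be substituted as the excerpt suggests:

```python
import numpy as np
from sklearn.svm import SVC

def ova_fit(X, y):
    """One-vs-All: train one binary machine per class (class k vs. the rest)."""
    y = np.asarray(y)
    classes = np.unique(y)
    machines = {k: SVC(kernel="rbf").fit(X, np.where(y == k, 1, -1))
                for k in classes}
    return classes, machines

def ova_predict(classes, machines, X_new):
    """Assign each point to the class whose binary machine scores it highest."""
    scores = np.column_stack([machines[k].decision_function(X_new)
                              for k in classes])
    return classes[scores.argmax(axis=1)]
```

Training one binary machine per class and taking the arg-max of their decision values is the entire scheme, which is why the excerpt prefers it to more complicated single-machine or error-correcting-code formulations.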
“…On the tic-tac-toe dataset, the MILP-E aggregated classifier overfits the design dataset though not dramatically. Previous studies [12] suggest that the tic-tac-toe dataset is highly irregular. Also on the handwritten digits dataset, there is some overfitting but it is not problematic.…”
Section: Optical Recognition Of Handwritten Digits (mentioning)
confidence: 93%