2007
DOI: 10.1007/s11063-007-9041-1
New Least Squares Support Vector Machines Based on Matrix Patterns

Abstract: The support vector machine (SVM), an effective method for classification problems, seeks the optimal hyperplane that maximizes the margin between two classes; this hyperplane is obtained by solving a constrained optimization criterion via quadratic programming (QP), which incurs a high computational cost. The least squares support vector machine (LS-SVM), a variant of SVM, avoids this shortcoming by obtaining an analytical solution directly from a set of linear equations instead of a QP. Bot…
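The abstract contrasts the QP of a standard SVM with the single linear system solved by an LS-SVM. A minimal sketch of that linear-system step, written in the common regression-form LS-SVM with an RBF kernel (function names, hyperparameters, and the toy data are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets A and B."""
    sq_a = np.sum(A**2, axis=1)
    sq_b = np.sum(B**2, axis=1)
    d2 = sq_a[:, None] + sq_b[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Train an LS-SVM classifier by solving one (n+1)x(n+1) linear system:

        [ 0    1^T          ] [b]     [0]
        [ 1    K + I/gamma  ] [alpha] [y]

    with K the kernel matrix and y in {-1, +1}. No QP solver is needed.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                      # constraint row: sum(alpha) = 0
    A[1:, 0] = 1.0                      # bias column
    A[1:, 1:] = K + np.eye(n) / gamma   # regularized kernel block
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]              # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Sign of the LS-SVM decision function at the new points."""
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ alpha + b)
```

The analytical solution comes at a price the abstract alludes to: because every training point becomes a support vector, the system is dense and of size n+1, so the trade is QP cost for an O(n^3) linear solve.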

Cited by 88 publications (30 citation statements)
References 13 publications (20 reference statements)
“…Different from [24], [25], this paper further gives why and when MatCD outperforms VecCD, which is validated by our experiments here.…”
Section: (B) [supporting, confidence: 62%]
“…• For realizing MatCD, we employ our previous work [24], [25] that proposes a matrix-pattern-oriented classifier design. Different from [24], [25], this paper further gives why and when MatCD outperforms VecCD, which is validated by our experiments here.…”
Section: (B) [mentioning, confidence: 99%]
“…Chen et al. developed a more general principal component analysis (MatPCA) to extract features based on matrix patterns [12]. Wang and Chen proposed a matrixized least squares support vector machine (MatLSSVM) that can directly classify objects represented by matrices [13]. Furthermore, Wang et al. improved MatLSSVM and proposed an efficient kernelized classifier named kernel-based matrixized least squares support vector machine (KMatLSSVM) [14].…”
Section: Introduction [mentioning, confidence: 99%]
“…Thus, u and v can be obtained iteratively by solving (12) and (13). Consequently, the decision function of the kernelized MatOCSVM for input matrix data point X is formed as…”
[mentioning, confidence: 99%]