2011
DOI: 10.1016/j.neunet.2011.03.009

Design of a multiple kernel learning algorithm for LS-SVM by convex programming

Cited by 30 publications (13 citation statements); references 15 publications.
“…$G = \sum_{l=1}^{M} a_l K_l$, $a_l \geq 0$, [43,44] reformulate the QCQP as a Semi-Infinite Linear Program (SILP), which reuses standard Support Vector Machine (SVM) implementations [5,11,21]. Moreover, Jian et al. [24] address the issue of multiple kernels for the Least Squares Support Vector Machine (LS-SVM) by formulating a Semi-Definite Program (SDP). Recently, researchers have proposed several new formulations for multiple kernel optimization [15,45,34].…”
Section: Related Work
confidence: 99%
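The excerpt above refers to the combined kernel $G = \sum_{l=1}^{M} a_l K_l$ with nonnegative weights. As a minimal sketch only (fixed illustrative weights `a`, a user-supplied list of kernel matrices `K_list`, and the standard Suykens-style LS-SVM regression dual, none of which are taken from the cited papers), the snippet below shows how such a combination plugs into an LS-SVM base solver of the kind that SILP/SDP wrappers call repeatedly:

```python
import numpy as np

def combined_kernel(K_list, a):
    """G = sum_l a_l * K_l with a_l >= 0 (weights fixed here for illustration)."""
    a = np.asarray(a, dtype=float)
    assert np.all(a >= 0), "kernel weights must be nonnegative"
    return sum(w * K for w, K in zip(a, K_list))

def lssvm_solve(G, y, gamma=1.0):
    """Standard LS-SVM regression dual: a single (n+1) x (n+1) linear system."""
    y = np.asarray(y, dtype=float)
    n = y.size
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                      # bias row:    1^T alpha = 0
    A[1:, 0] = 1.0                      # bias column
    A[1:, 1:] = G + np.eye(n) / gamma   # kernel block with ridge term 1/gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]              # bias b, dual coefficients alpha
```

With the kernel weights held fixed, each LS-SVM fit is just a dense linear solve, which is why wrapper formulations (SILP, SDP, QCQP) can afford to invoke the base learner many times while searching over the weights.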
“…Substituting … by … and moving it to the constraint, we get the following QCQP problem (29). Equations (27) and (29) are the QCQP problems with complexity…”
Section: B. Multiple Kernel for LS-SVM
confidence: 99%
“…Standard software such as MOSEK can solve them effectively. The first dual variables obtained from solving (29) can act as the optimal kernel coefficients [18], [40]. It is thus easy to obtain the optimal regularization parameter.…”
Section: B. Multiple Kernel for LS-SVM
confidence: 99%
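The two excerpts above refer to QCQP formulations (27) and (29) whose details are not reproduced here, and note that the dual variables of the solved problem yield the kernel coefficients. As a hedged illustration only, the sketch below uses a generic Lanckriet-style MKL QCQP (not the paper's exact problem) with CVXPY as a stand-in modelling layer (MOSEK can be chosen as the underlying solver if installed); `K_list`, `y`, and `C` are assumed inputs:

```python
import numpy as np
import cvxpy as cp

def mkl_qcqp_weights(K_list, y, C=1.0):
    """Bound each per-kernel quadratic term by a shared epigraph variable t;
    the duals of those quadratic constraints act as (unnormalized) kernel weights."""
    y = np.asarray(y, dtype=float)
    n = y.size
    alpha = cp.Variable(n)
    t = cp.Variable()

    quad_cons = []
    for K in K_list:
        Q = np.outer(y, y) * K           # (y y^T) o K_l, PSD by construction
        Q = 0.5 * (Q + Q.T)              # enforce exact symmetry
        quad_cons.append(cp.quad_form(alpha, cp.psd_wrap(Q)) <= t)

    cons = quad_cons + [alpha >= 0, alpha <= C, y @ alpha == 0]
    prob = cp.Problem(cp.Maximize(cp.sum(alpha) - 0.5 * t), cons)
    prob.solve()                         # e.g. prob.solve(solver=cp.MOSEK)

    duals = np.array([c.dual_value for c in quad_cons])
    return alpha.value, duals / max(duals.sum(), 1e-12)
```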
“…Many researchers have pointed out three crucial problems in SVR that urgently need to be addressed: (1) how to choose or construct an appropriate kernel for forecasting problems [8,9]; (2) how to optimize the parameters of SVR to improve prediction quality [10,11]; (3) how to construct a fast algorithm that operates on large datasets [12,13]. With unsuitable kernel functions or hyperparameter settings, SVR may produce poor prediction results.…”
Section: Introduction
confidence: 99%
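Points (1) and (2) in the excerpt above concern kernel choice and hyperparameter tuning for SVR. A small illustrative sketch, using scikit-learn with synthetic data and an arbitrary parameter grid (neither taken from the cited works), of addressing both via cross-validated grid search:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV

# Synthetic 1-D regression data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

pipe = make_pipeline(StandardScaler(), SVR())
param_grid = {
    "svr__kernel": ["rbf", "poly", "sigmoid"],   # problem (1): kernel choice
    "svr__C": [0.1, 1, 10, 100],                 # problem (2): regularization
    "svr__gamma": ["scale", 0.01, 0.1, 1.0],     # problem (2): kernel width
}
search = GridSearchCV(pipe, param_grid, cv=5, scoring="neg_mean_squared_error")
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```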