2020
DOI: 10.1155/2020/3238129
Linear Twin Quadratic Surface Support Vector Regression

Abstract: Twin support vector regression (TSVR) generates two nonparallel hyperplanes by solving a pair of smaller-sized problems instead of a single larger-sized problem in the standard SVR. Due to its efficiency, TSVR is frequently applied in various areas. In this paper, we propose a totally new version of TSVR named Linear Twin Quadratic Surface Support Vector Regression (LTQSSVR), which directly uses two quadratic surfaces in the original space for regression. It is worth noting that our new approach not only avoid…
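To illustrate the kernel-free idea the abstract describes (fitting a quadratic surface directly in the original space rather than through a kernel), here is a minimal sketch. This is not the paper's LTQSSVR solver; it simply fits one quadratic surface f(x) = xᵀAx + bᵀx + c by ordinary least squares on an explicit quadratic feature map, with all function names and data chosen for illustration.

```python
import numpy as np

def quadratic_features(X):
    """Map each sample x to [x_i * x_j (i <= j), x, 1],
    the coefficients of a quadratic surface in the original space."""
    n, d = X.shape
    quad = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(quad + [X, np.ones(n)])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
# Noiseless target lying exactly on a quadratic surface
y = X[:, 0] ** 2 - X[:, 0] * X[:, 1] + 2 * X[:, 1] + 0.5

Phi = quadratic_features(X)          # explicit quadratic feature map
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # least-squares fit

residual = np.abs(Phi @ theta - y).max()  # ~0 on this noiseless data
```

A twin formulation, as in the paper, would instead solve two such smaller problems to obtain a pair of nonparallel surfaces bounding the data from above and below.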

Cited by 2 publications (3 citation statements)
References 29 publications (36 reference statements)
“…When the optimal solution u* is obtained by the above two updated iteration Formulas (34) and (35), the optimal solution w* of the optimization problem (28) is w* = w0 + Fu*. Then, we summarize the process of finding the optimal solution A*, b*, c* of the optimization problem (18) in Algorithm 1.…”
Section: Algorithm
confidence: 99%
“…After that, Bai et al [28] proposed the quadratic kernel-free least-squares support vector machine for target diseases' classification. Following these leading works, some scholars performed further studies, e.g., see [29][30][31][32][33][34] for the classification problem, [35] for the regression problem, and [36] for the clustering problem. The good performance of these methods demonstrates that the quadratic hypersurface is an effective way to flexibly capture the nonlinear structure of data.…”
Section: Introduction
confidence: 99%
“…For the regression problem, Ye et al [26,27] proposed two kernel-free nonlinear regression models, quadratic surface kernel-free least squares SVR (QLSSVR) and ε-kernel-free soft QSSVR (ε-SQSSVR), respectively. Zhai et al [28] proposed a linear twin quadratic surface support vector regression. Zheng et al [29] developed a hybrid QSSVR and applied it to stock indices and price forecasting.…”
Section: Introduction
confidence: 99%