2016
DOI: 10.1016/j.neunet.2015.10.007
TWSVR: Regression via Twin Support Vector Machine

Cited by 64 publications (13 citation statements). References 6 publications.
“…An SVR principle was used to establish a regression model, considering the combination of ΔTEAC_DPPH, ΔTEAC_ABTS and ΔTEAC_FRAP as independent variables, to predict the inhibition/promotion rate of acrylamide formation. The standard epsilon (ε)-insensitive SVR model used in the present work sets an ε-tube around the data points, within which errors are discarded by an ε-insensitive loss function [40]. Under this setting, the selection criteria for an optimized SVR model mainly comprise tuning the cost parameter of ε-SVR (c) and the γ parameter of the radial basis function (RBF) kernel (g).…”
Section: Results
confidence: 99%
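The ε-insensitive SVR with (c, g) tuning described in this excerpt can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic data; the grid values, ε, and dataset are assumptions for demonstration, not the settings used in the cited study.

```python
# Sketch of epsilon-insensitive SVR with an RBF kernel, tuning the cost
# parameter C ("c") and the kernel width gamma ("g") by cross-validated
# grid search. Data is synthetic, purely for illustration.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(120)

# Residuals smaller than epsilon fall inside the epsilon-tube and incur
# no loss; C penalizes points that fall outside the tube.
grid = GridSearchCV(
    SVR(kernel="rbf", epsilon=0.1),
    param_grid={"C": [0.1, 1, 10], "gamma": [0.1, 1, 10]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)
```

The grid search selects the (C, γ) pair with the best cross-validated R², which mirrors the "parameter optimization of c and g" step the excerpt describes.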
“…With these, we seek to give the surrogate model the greatest possibility of achieving suitable performance with the minimum number of plant evaluations. In our implementation, we selected a set of three machine learning algorithms: a support vector machine regressor (SVM) [48], a random forest regressor (RF) [49], and a Bayesian ridge model (BR) [50], as shown in Equation (13), although this methodology can be generalized to any set of machine learning algorithms.…”
Section: Proposed Methods (ASAMS)
confidence: 99%
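The three-regressor surrogate setup in this excerpt can be sketched with scikit-learn. The model choice (SVM, RF, Bayesian ridge) follows the excerpt; the data and hyperparameters are placeholder assumptions, not those of the cited work.

```python
# Sketch of a surrogate-model candidate set: three off-the-shelf
# regressors fitted to the same (synthetic) data, so the best performer
# can be chosen with few expensive plant evaluations.
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.05 * rng.standard_normal(200)

models = {
    "SVM": SVR(kernel="rbf", C=10.0),
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "BR": BayesianRidge(),
}
for name, model in models.items():
    model.fit(X, y)
    print(name, round(model.score(X, y), 3))
```

In practice the candidate with the best held-out score would be kept as the surrogate; as the excerpt notes, the same loop generalizes to any set of regressors.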
“…Zhao et al. [37] extended the concept of twin hyperplanes and combined it with the advantages of least squares support vector regression (LSSVR) to generate the estimated regressor, called Twin Least Squares Support Vector Regression (TLSSVR). Examining the model of Peng [34], Khemchandani et al. [38] argued that only the principle of empirical risk minimization was considered in TSVR. To overcome these difficulties, Shao et al. [39] proposed another twin regression model, called ε-TSVR, which follows the principle of structural risk minimization.…”
Section: Introduction
confidence: 99%
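The twin-regressor idea behind TSVR and TLSSVR can be illustrated in a heavily simplified form: instead of one ε-tube, two functions are fitted, a down-bound f₁ to y − ε₁ and an up-bound f₂ to y + ε₂, and the final estimate is their mean. The actual TSVR solves two quadratic programming problems (and TLSSVR two linear systems); the ridge-style least-squares fits below are an assumption made to keep the sketch short, so this is conceptual only, not the published formulation.

```python
# Conceptual sketch of the "twin" construction: fit a down-bound and an
# up-bound function to epsilon-shifted targets, then average them to get
# the final regressor. Each bound here is a plain regularized
# least-squares fit, standing in for the two QPPs of TSVR.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(100, 1))
y = 2.0 * X.ravel() + 1.0 + 0.1 * rng.standard_normal(100)

def ridge_fit(X, t, lam=1e-3):
    # Solve (A^T A + lam*I) w = A^T t, with a bias column appended to X.
    A = np.hstack([X, np.ones((len(X), 1))])
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ t)

eps1, eps2 = 0.2, 0.2
w1 = ridge_fit(X, y - eps1)   # down-bound function f1
w2 = ridge_fit(X, y + eps2)   # up-bound function f2
w = (w1 + w2) / 2             # final regressor: mean of the two bounds
print(np.round(w, 2))         # recovers roughly slope 2, intercept 1
```

With equal shifts ε₁ = ε₂ the averaged regressor recovers the central trend; the appeal of the twin formulation is that each bound is obtained from a smaller, cheaper optimization problem than a single standard SVR.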