2019
DOI: 10.1007/s00521-018-03981-1
A novel support vector regression algorithm incorporated with prior knowledge and error compensation for small datasets

Cited by 14 publications (3 citation statements)
References 25 publications
“…where the coefficients w and b are the weight vector and bias term, respectively. This linear function can be constrained to the following optimization problem [10,11,52]. Using ε as the insensitive loss function, the corresponding SVR, which is called ε-SVR, can be represented as follows:…”
Section: Conflicts of Interest
confidence: 99%
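The excerpt above breaks off before the optimization problem it introduces. For reference, the standard ε-SVR primal takes the following textbook form (a sketch of the usual formulation; the cited paper's exact notation may differ):

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi},\,\boldsymbol{\xi}^{*}}\;
\frac{1}{2}\lVert \mathbf{w} \rVert^{2}
+ C \sum_{i=1}^{n} \left( \xi_i + \xi_i^{*} \right)
\quad \text{s.t.} \quad
\begin{cases}
  y_i - \left(\mathbf{w}\cdot\mathbf{x}_i + b\right) \le \varepsilon + \xi_i, \\
  \left(\mathbf{w}\cdot\mathbf{x}_i + b\right) - y_i \le \varepsilon + \xi_i^{*}, \\
  \xi_i,\ \xi_i^{*} \ge 0, \qquad i = 1,\dots,n,
\end{cases}
```

where the slack variables ξ_i, ξ_i* measure how far a sample falls outside the ε-insensitive tube, and C trades off flatness of the regressor against tolerance of such deviations.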
“…Therefore, a surrogate modeling method has been developed rapidly over the last three decades as an alternative for computationally expensive simulations that consumes less time [1]. A wide variety of surrogate models have been used in engineering design, such as polynomial response surface (PRS) [2,3], Kriging (KRG) [3][4][5][6], radial basis function (RBF) [7,8], and support vector regression (SVR) [8][9][10][11][12][13]. The PRS and SVR models can identify global trends for a given input data set; whereas, owing to the interpolation characteristics, KRG and RBF have higher local accuracy around the training points.…”
Section: Introduction
confidence: 99%
“…The antenna theories and Feko are used for the calculation of power pattern with position errors. 12 Prediction models are built based on a linear programming support vector regression (LPSVR) [13][14][15] algorithm. Section 4 presents the evaluation results on measured data and Section 5 reports our conclusions.…”
Section: Introduction
confidence: 99%
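Both SVR-based excerpts above rest on the ε-insensitive loss: prediction errors inside the ε tube cost nothing, and only the excess beyond ε is penalized. A minimal plain-Python sketch of that loss (the function name is hypothetical; this is not the LPSVR algorithm of [13–15]):

```python
def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """epsilon-insensitive loss per sample: max(0, |y - f(x)| - eps).

    Residuals within the eps tube incur zero loss; only the portion of
    the absolute error exceeding eps is counted.
    """
    return [max(0.0, abs(t - p) - eps) for t, p in zip(y_true, y_pred)]

# Residuals of 0.05 and 0.30 with eps = 0.1 give losses of 0.0 and ~0.2.
print(eps_insensitive_loss([1.0, 2.0], [1.05, 2.30], eps=0.1))
```

This is the loss that the ε-SVR primal minimizes (through the slack variables) together with the ½‖w‖² flatness term.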