2008
DOI: 10.1016/j.jelechem.2008.06.021

Least-squares support vector machines for simultaneous voltammetric determination of lead and tin: A comparison between LS-SVM and PLS in voltammetric data

Citations: cited by 33 publications (17 citation statements)
References: 41 publications

“…22,23 According to mobility data (Table 1), the data were classified into training and prediction sets according to the Kennard-Stone algorithm. The optimum number of factors to be included in the calibration model was determined by computing the prediction error sum of squares (PRESS) for cross-validated models using a high number of factors (half of the total number of training samples + 1).…”
Section: Partial Least Squares Analysis (mentioning)
confidence: 99%
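The factor-selection procedure described in this excerpt can be sketched in a few lines. The Python snippet below is an illustrative reconstruction, not the authors' code: it assumes a voltammogram matrix X and a concentration vector y, splits the samples with a simple Kennard-Stone implementation, and picks the number of PLS latent variables by minimizing the leave-one-out PRESS up to half the training-set size plus one.

```python
# Illustrative sketch (not the cited authors' code): Kennard-Stone splitting and
# PRESS-based selection of the number of PLS factors.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

def kennard_stone(X, n_train):
    """Select n_train sample indices that span X as uniformly as possible."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # Start with the two most distant samples.
    selected = list(np.unravel_index(np.argmax(dist), dist.shape))
    remaining = [i for i in range(len(X)) if i not in selected]
    while len(selected) < n_train:
        # Add the sample farthest from its nearest already-selected neighbour.
        d_min = dist[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining.pop(int(np.argmax(d_min))))
    return np.array(selected), np.array(remaining)

def press_per_factor(X_train, y_train, max_factors):
    """PRESS from leave-one-out cross-validation for 1..max_factors latent variables."""
    press = []
    for n_lv in range(1, max_factors + 1):
        errors = []
        for tr, te in LeaveOneOut().split(X_train):
            model = PLSRegression(n_components=n_lv).fit(X_train[tr], y_train[tr])
            errors.append((y_train[te] - model.predict(X_train[te]).ravel()) ** 2)
        press.append(np.sum(errors))
    return np.array(press)

# Hypothetical usage, X (n_samples x n_potentials), y (n_samples,):
# train_idx, test_idx = kennard_stone(X, n_train=int(0.7 * len(X)))
# max_factors = len(train_idx) // 2 + 1          # as described in the excerpt
# press = press_per_factor(X[train_idx], y[train_idx], max_factors)
# n_opt = int(np.argmin(press)) + 1              # number of factors with minimum PRESS
```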
“…However, the most widely used kernel functions are the radial basis function (RBF), exp(−‖xᵢ − xⱼ‖²/2σ²), a simple Gaussian function, and polynomial functions (xᵢ · xⱼ)^d, where σ² is the width of the Gaussian function and d is the polynomial degree, which should be optimized by the user to obtain the support vectors. For σ of the RBF kernel and d of the polynomial kernel it should be stressed that careful model selection of these tuning parameters, in combination with the regularization constant γ, is very important in order to achieve a model with good generalization [46]. The free LS-SVM toolbox (LS-SVM V-1.5, Suykens, Leuven, Belgium) was used with MATLAB Version 7.6 to derive all the LS-SVM models [45].…”
Section: Descriptors Calculation (mentioning)
confidence: 99%
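To make the roles of the RBF width σ² and the regularization constant γ concrete, here is a minimal LS-SVM regression sketch in Python following the standard Suykens formulation, in which a single linear system replaces the quadratic program of a classical SVM. This is an assumed, generic implementation for illustration, not the LS-SVMlab toolbox mentioned in the excerpt.

```python
# Minimal LS-SVM regression sketch (standard formulation, assumed for illustration).
import numpy as np

def rbf_kernel(A, B, sigma2):
    """K[i, j] = exp(-||a_i - b_j||^2 / (2 * sigma2))."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq / (2.0 * sigma2))

def lssvm_fit(X, y, sigma2, gamma):
    """Solve the LS-SVM system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(X)
    K = rbf_kernel(X, X, sigma2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, support values alpha

def lssvm_predict(X_new, X_train, alpha, b, sigma2):
    """f(x) = sum_i alpha_i * k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b
```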
“…When conducting the LS-SVM simulation, selection of the kernel function is a crucial problem that needs to be solved [23,24]. In principle, the LS-SVM always fits a linear relation (y = ωx + b) between the regressors x and the dependent variable y.…”
Section: Kernel Function Selection (mentioning)
confidence: 99%
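Because the excerpts stress that the kernel and its tuning parameters must be chosen carefully, a hedged sketch of that selection step follows: a plain grid search over (σ², γ) scored by k-fold cross-validated RMSE, reusing the illustrative lssvm_fit and lssvm_predict helpers from the previous sketch (hypothetical names, not toolbox functions); the grid values are placeholders, not those of the cited work.

```python
# Hypothetical (sigma^2, gamma) grid search for the LS-SVM sketch above,
# scored by k-fold cross-validated RMSE.
import numpy as np
from sklearn.model_selection import KFold

def grid_search_lssvm(X, y, sigma2_grid, gamma_grid, n_splits=5):
    best = (None, None, np.inf)
    for sigma2 in sigma2_grid:
        for gamma in gamma_grid:
            fold_mse = []
            for tr, te in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
                b, alpha = lssvm_fit(X[tr], y[tr], sigma2, gamma)
                pred = lssvm_predict(X[te], X[tr], alpha, b, sigma2)
                fold_mse.append(np.mean((y[te] - pred) ** 2))
            rmse = np.sqrt(np.mean(fold_mse))
            if rmse < best[2]:
                best = (sigma2, gamma, rmse)
    return best   # (best sigma^2, best gamma, cross-validated RMSE)

# Example grids (log-spaced, purely illustrative):
# sigma2_grid = np.logspace(-2, 3, 6)
# gamma_grid  = np.logspace(-1, 4, 6)
```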