2004
DOI: 10.1016/j.neunet.2004.07.002

Fast exact leave-one-out cross-validation of sparse least-squares support vector machines

Cited by 282 publications (202 citation statements)
References 23 publications
“…The process is repeated for all the singletons in the training set. We use Allen's PRESS (predicted residual sum of squares) statistic for this process, $\mathrm{PRESS} = \sum_{i=1}^{T} e_{(i)}^{2}$ [9], where $e_{(i)} = y_i - \hat{y}_{(i)}$ is the residual for the $i$th example with the $i$th example excluded from the training process and $\hat{y}_{(i)}$ is the predicted response for the $i$th example based on that training process. Fortunately, we have $e_{(i)} = \frac{e_i}{1 - h_{ii}}$, where $e_i = y_i - \hat{y}_i$ is the residual for the $i$th example in the training process which includes all examples and $\hat{y}_i$ is the fitted response based on this training.…”
Section: Kernel Ridge Regression as a Learning Method
confidence: 99%
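The identity $e_{(i)} = e_i/(1 - h_{ii})$ quoted above is what makes exact leave-one-out validation cheap: a single fit of the full model yields every held-out residual, so PRESS costs no more than one training run plus the hat-matrix diagonal. A minimal sketch of that shortcut for plain kernel ridge regression (my own illustration, not code from the paper; the RBF kernel and the values of gamma and lambda are arbitrary assumptions):

```python
import numpy as np

# Sketch: exact LOO residuals for kernel ridge regression via the
# PRESS shortcut e_(i) = e_i / (1 - h_ii), checked against brute force.
rng = np.random.default_rng(0)
n = 40
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(n)

def rbf(A, B, gamma=0.5):
    # Gaussian RBF kernel matrix between row sets A and B.
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

lam = 1e-2                                   # ridge parameter (assumed value)
K = rbf(X, X)
H = K @ np.linalg.inv(K + lam * np.eye(n))   # hat matrix: y_hat = H y
e = y - H @ y                                # ordinary training residuals
e_loo = e / (1.0 - np.diag(H))               # exact LOO residuals, one fit only
press = np.sum(e_loo ** 2)                   # Allen's PRESS statistic

# Brute-force check: actually retrain with example i held out.
for i in range(3):
    mask = np.arange(n) != i
    alpha = np.linalg.solve(rbf(X[mask], X[mask]) + lam * np.eye(n - 1), y[mask])
    y_pred_i = (rbf(X[i:i + 1], X[mask]) @ alpha)[0]
    assert np.isclose(y[i] - y_pred_i, e_loo[i])
```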
“…K-fold cross-validation is used because it makes good use of the available data both as training and test data (Cawley & Talbot, 2004).…”
Section: Model Validation
confidence: 99%
“…To test the computational speed of the proposed CV method in practice, we make an experimental speed comparison between it and the fastest previously proposed CV approach for sparse RLS, the $O(mn^2)$-time LOOCV algorithm proposed by Cawley and Talbot (2004). We only test LOOCV without removing the basis vectors, because the baseline method is defined only for that setting.…”
Section: Speed Comparisons
confidence: 99%
“…Recently, Cawley and Talbot (2004) proposed this type of LOOCV algorithm. Its computational complexity of only $O(mn^2)$ makes it much more practical than the LOOCV algorithm of standard RLS used together with the above-mentioned modified kernel function, because it is as expensive as the training process of sparse RLS.…”
confidence: 99%
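In the sparse setting the same rank-one update argument goes through as long as the basis vectors are held fixed, which matches the "without removing the basis vectors" restriction quoted above: one cubic-cost factorization in the number of basis vectors plus a single pass over the hat-matrix diagonal yields all exact LOO residuals. A rough sketch under assumed notation (m basis vectors, n examples; this illustrates the idea and is not the authors' published algorithm):

```python
import numpy as np

# Sketch (assumed notation, not the authors' code): exact LOO residuals
# for sparse RLS, f(x) = sum_j beta_j k(x, b_j), with the basis fixed.
rng = np.random.default_rng(1)
n, m, lam = 200, 20, 1e-2
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(n)
B = X[:m]                                  # basis vectors (assumed choice)

def rbf(A, C, gamma=0.5):
    return np.exp(-gamma * ((A[:, None, :] - C[None, :, :]) ** 2).sum(-1))

K_nm = rbf(X, B)                           # n x m design matrix
K_mm = rbf(B, B)                           # m x m regularizer block
A_inv = np.linalg.inv(K_nm.T @ K_nm + lam * K_mm)  # factorize once
beta = A_inv @ (K_nm.T @ y)                # sparse RLS coefficients

e = y - K_nm @ beta                        # training residuals
h = np.einsum('ij,jk,ik->i', K_nm, A_inv, K_nm)    # hat-matrix diagonal
e_loo = e / (1.0 - h)                      # exact LOO residuals

# Brute-force check: retrain without example i (basis kept fixed).
for i in range(m, m + 3):
    mask = np.arange(n) != i
    bi = np.linalg.inv(K_nm[mask].T @ K_nm[mask] + lam * K_mm) @ (K_nm[mask].T @ y[mask])
    assert np.isclose(y[i] - K_nm[i] @ bi, e_loo[i])
```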