2016
DOI: 10.1109/tnnls.2015.2430935

The Proximal Trajectory Algorithm in SVM Cross Validation

Abstract: We propose a bilevel cross-validation scheme for support vector machine (SVM) model selection based on the construction of the entire regularization path. Since such a path is a particular case of the more general proximal trajectory concept from nonsmooth optimization, we propose for its construction an algorithm based on solving a finite number of structured linear programs. Our methodology, unlike other approaches, works directly on the primal form of SVM. Numerical results are presented on binary d…
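As a rough illustration of the model-selection problem the paper addresses, the sketch below evaluates a discretized grid of regularization values by cross-validation using scikit-learn and a synthetic data set. It is only a stand-in: the paper's proximal trajectory algorithm instead builds the exact regularization path by solving structured linear programs on the primal problem.

```python
# Hedged sketch: grid-based model selection over the SVM regularization parameter,
# as a coarse stand-in for following the entire regularization path.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Discretized stand-in for the continuous regularization path.
C_grid = np.logspace(-3, 3, 25)
cv_scores = [
    cross_val_score(LinearSVC(C=C, dual=True, max_iter=10000), X, y, cv=5).mean()
    for C in C_grid
]

best_C = C_grid[int(np.argmax(cv_scores))]
print(f"selected C = {best_C:.4g}, CV accuracy = {max(cv_scores):.3f}")
```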

Year Published: 2016–2023
Cited by 28 publications (9 citation statements). References 27 publications.
“…[7]), an approach exhibiting both good generalisation capabilities and high computational efficiency, since the training phase requires only the solution of a convex problem. The latter characteristic allows one to experiment with several variants of the basic model in order to adapt it to different settings; see for example the recent works [2,3,6,9]. The main idea of the SVM technique is the introduction of the concept of "margin" in the strict separation of two sets of points by means of a hyperplane.…”
Section: The Model (mentioning)
confidence: 99%
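For reference, the standard primal soft-margin SVM problem that this excerpt alludes to (a convex program whose solution separates the two point sets with maximum margin, softened by slack variables) can be written as below; this is the textbook formulation, not necessarily the exact variant used in [7] or in the citing paper:

```latex
\min_{w,\, b,\, \xi} \quad \frac{1}{2}\,\lVert w \rVert^{2} + C \sum_{i=1}^{n} \xi_i
\qquad \text{s.t.} \quad y_i \left( w^{\top} x_i + b \right) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, n
```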
“…In the table, only the cross-validation accuracies are reported. A training-test sample proportion of 30% and 10-fold cross-validation were used in all cases [51]. For all models, grid search [52] with cross-validation was used to determine the optimal hyperparameters.…”
Section: Identification Of Error Patterns In Eye Gaze Data (mentioning)
confidence: 99%
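A minimal sketch of the protocol described in this excerpt, assuming scikit-learn and synthetic data in place of the study's eye-gaze features: a 70/30 train-test split combined with 10-fold grid-search cross-validation over SVM hyperparameters.

```python
# Hedged sketch: 30% held-out test set plus 10-fold grid-search cross-validation
# on the training portion to select SVM hyperparameters.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=0)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": ["scale", 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=10)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```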
“…Based on the assumption that environmental conditions around Acer mono trees affect sap exudation (farmers report that exudation increases with a large daily temperature range, owing to osmotic pressure effects, and decreases with drying and rising temperatures), the present study designed an Acer mono sap exudation prediction model with seven parameters: average temperature, daily high, daily low, daily temperature range, maximum humidity, minimum humidity, and precipitation. Four algorithms were compared: linear regression [26][27][28][29][30][31][32][33][34][35], SVM [36][37][38][39][40][41][42][43][44][45][46], ANN [47][48][49][50][51][52][53][54][55][56], and random forest [57][58][59][60][61]. Linear regression predicts and classifies based on linear regression equations derived from the analysis of correlations between dependent and independent variables.…”
Section: Evaluation Of Acer Mono Sap Output Amount Prediction Model (mentioning)
confidence: 99%
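The comparison described in this excerpt can be illustrated with the hedged sketch below, which evaluates the four model families on the seven listed weather parameters using scikit-learn; the column names and the synthetic target are assumptions made for illustration, not the study's data.

```python
# Hedged sketch: cross-validated comparison of the four model families on
# seven weather features (synthetic placeholder data).
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
cols = ["avg_temp", "high_temp", "low_temp", "temp_range",
        "max_humidity", "min_humidity", "precipitation"]
X = pd.DataFrame(rng.normal(size=(200, len(cols))), columns=cols)
y = rng.normal(size=200)  # placeholder for measured sap exudation

models = {
    "linear regression": LinearRegression(),
    "SVM (SVR)": SVR(),
    "ANN (MLP)": MLPRegressor(max_iter=2000, random_state=0),
    "random forest": RandomForestRegressor(random_state=0),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {score:.3f}")
```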
“…SVM was implemented with scikit-learn. Since Acer mono sap exudation was predicted in a high-dimensional setting with seven parameters, the RBF kernel, which remains efficient in high dimensions, was used to search for an optimal model [36][37][38][39][40][41][42][43][44][45][46].…”
Section: Support Vector Machine Model (mentioning)
confidence: 99%
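A minimal sketch of fitting an RBF-kernel SVR with scikit-learn and tuning C, gamma, and epsilon by cross-validation, in the spirit of this excerpt; the data and the parameter grid are placeholders, not those of the study.

```python
# Hedged sketch: RBF-kernel support vector regression with standardized inputs
# and a small cross-validated hyperparameter search.
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=7, noise=10.0, random_state=0)

pipe = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
param_grid = {
    "svr__C": [1, 10, 100],
    "svr__gamma": ["scale", 0.01, 0.1],
    "svr__epsilon": [0.1, 0.5, 1.0],
}
search = GridSearchCV(pipe, param_grid, cv=5, scoring="r2")
search.fit(X, y)
print("best params:", search.best_params_)
```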