2011
DOI: 10.1142/s0219691311004031

ALGORITHM OF ε-SVR BASED ON A LARGE-SCALE SAMPLE SET: STEP-BY-STEP SEARCH

Abstract: Because the support vectors of ε-SVR are not distributed inside the ε-belt but lie only on its outskirts, a novel algorithm for constructing the ε-SVR of a large-scale training sample set is proposed in this paper. It first computes the ε-SVR hyperplane of a small training sample set and the distances d of all samples to the hyperplane, then deletes the samples not in the band ε ≤ d ≤ dmax, searches for support vectors gradually within the scope ε ≤ d ≤ dmax, and trains the final ε-SVR step by step. Finally, it a…
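The procedure the abstract describes can be sketched as follows. This is an illustrative reconstruction using scikit-learn's `SVR` rather than the paper's own implementation; the band threshold `d_max`, the subset size, and all hyperparameters are assumptions, not values from the paper.

```python
# Sketch of the step-by-step search: fit eps-SVR on a small subset,
# measure each sample's distance d to the regression surface, keep only
# candidates in the band eps <= d <= d_max, then retrain on that subset.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=2000)

eps, d_max = 0.1, 0.5  # assumed values for illustration

# Step 1: eps-SVR hyperplane of a small training sample set.
idx = rng.choice(len(X), size=200, replace=False)
model = SVR(kernel="rbf", epsilon=eps).fit(X[idx], y[idx])

# Step 2: distances d of all samples to the current surface.
d = np.abs(y - model.predict(X))

# Step 3: delete samples not in the band eps <= d <= d_max; support
# vectors of eps-SVR lie on or outside the eps-belt, so samples far
# from the band are unlikely to become SVs of the refined model.
band = (d >= eps) & (d <= d_max)
keep = np.union1d(idx, np.flatnonzero(band))

# Step 4: retrain on the reduced set (the paper iterates this search).
final = SVR(kernel="rbf", epsilon=eps).fit(X[keep], y[keep])
print(len(keep), "of", len(X), "samples used")
```

The payoff is that the expensive final fit runs on a fraction of the original sample set while retaining the samples most likely to be support vectors.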

Cited by 2 publications (2 citation statements); References 8 publications
“…We then use the weight vector and the bias term to find the  in the case of a linear vector. In Algorithm 3, we first train the SVM using the svmtrain() function of LIBSVM, whose time complexity is , where n represents the number of instances in a data set [12], and then we use [9] to implement Step 7 in Algorithm 3, whose time complexity is again  [48]. Therefore, the overall time complexity of Algorithm 3 is .…”
Section: -TSVM: A Robust Transductive Support Vector Machine with Truncated Pinball Loss Function
Mentioning confidence: 99%
“…We observe that the dual form of -TSVM performs better than the rest of the techniques on most data sets. Note that for SVM and TSVM, we implement the dual forms of these techniques for small data sets only, as the computational time depends on the number of examples [48], so we cannot use them for large-scale data sets. We implement the primal form using SGD (Algorithm 1) for large real-world data sets.…”
Section: Numerical Experiments
Mentioning confidence: 99%
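The primal-form SGD training mentioned in the statement above can be sketched with scikit-learn's `SGDClassifier`; this is an assumed stand-in for the cited work's Algorithm 1, and the data and hyperparameters are illustrative only.

```python
# Minimal sketch: a primal linear SVM trained by stochastic gradient
# descent, suitable for large data sets where dual (kernel) solvers
# scale poorly in the number of examples.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
n = 20_000
X = rng.normal(size=(n, 20))
w_true = rng.normal(size=20)
y = np.sign(X @ w_true)  # linearly separable labels

# loss="hinge" gives the standard soft-margin SVM objective in the
# primal; each SGD epoch is O(n * d), independent of any kernel matrix.
clf = SGDClassifier(loss="hinge", alpha=1e-4, max_iter=5, tol=None)
clf.fit(X, y)
print("train accuracy:", clf.score(X, y))
```

Because the cost per epoch is linear in the number of examples, this primal route stays tractable at scales where storing or factorizing the dual's kernel matrix would not.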