2014
DOI: 10.1016/j.knosys.2014.01.018
Training Lagrangian twin support vector regression via unconstrained convex minimization

Cited by 44 publications (9 citation statements)
References 27 publications
“…In this section, four functions are employed to verify the feasibility of the ROGA-SVR [23,24]. The definitions of these functions are listed in Table I.…”
Section: Simulation Examples (mentioning)
confidence: 99%
“…Experimental results show the effectiveness of TWSVM over SVM and GEPSVM [22]. TWSVM takes O(¼m³) operations, which is 1/4 of that of the standard SVM, whereas GEPSVM takes O(¼n³). Here, m is the number of training examples, n is the dimensionality, and m >> n [22,24].…”
Section: Introduction (mentioning)
confidence: 95%
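
The 1/4 factor quoted above comes from splitting one large quadratic program into two smaller ones; a minimal derivation, assuming each of TWSVM's two QPPs involves roughly m/2 of the m training points:

```latex
% Standard SVM: one QPP over all m points, empirical cost about O(m^3).
% TWSVM: two QPPs, each over roughly m/2 points.
\[
  2 \times O\!\Big(\big(\tfrac{m}{2}\big)^{3}\Big)
  \;=\; 2 \times \tfrac{m^{3}}{8}
  \;=\; \tfrac{1}{4}\,m^{3},
\]
% i.e. about one quarter of the O(m^3) cost of a single full-size SVM QPP.
```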
“…However, Chapelle [47] observed, by comparing the approximate efficiency of SVR in the primal and the dual space, that an approximate dual solution may not produce a good primal approximate solution. Some related works solve the problem directly in the primal space [48,49,50,51]. For example, inspired by the twin SVR and Newton methods, Balasundaram et al. [49] proposed a new unconstrained Lagrangian TSVR (ULTSVR) that solves a pair of unconstrained minimization problems, thereby increasing the computation speed.…”
Section: Introduction (mentioning)
confidence: 99%
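
To make the "unconstrained minimization in the primal space" idea behind ULTSVR concrete, here is a minimal sketch. It is not the formulation of Balasundaram et al.: the TSVR-style down-bound objective, the softplus smoothing of the plus function, and the use of an L-BFGS solver are assumptions made purely for illustration.

```python
# Illustrative sketch only: a TSVR-style down-bound regressor trained by
# unconstrained smooth minimization in the primal space. The objective,
# the softplus smoothing of max(0, t), and the L-BFGS solver are
# assumptions for this example, not the exact ULTSVR formulation.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = 2.0 * X.ravel() + 1.0 + 0.2 * rng.standard_normal(200)

C1, eps1, alpha = 10.0, 0.1, 50.0   # penalty weight, tube width, smoothing sharpness

def softplus(t):
    """Smooth approximation of the plus function max(0, t)."""
    return np.logaddexp(0.0, alpha * t) / alpha

def objective(theta):
    w, b = theta[:-1], theta[-1]
    f = X @ w + b                     # candidate down-bound estimator f1(x)
    fit = y - eps1 - f                # squared-loss fitting term
    slack = softplus(f - (y - eps1))  # smoothed penalty for violating the bound
    return 0.5 * fit @ fit + C1 * slack.sum()

theta0 = np.zeros(X.shape[1] + 1)
res = minimize(objective, theta0, method="L-BFGS-B")
w1, b1 = res.x[:-1], res.x[-1]
print("down-bound regressor: w =", w1, "b =", b1)
```

The up-bound regressor would come from the symmetric second problem; supplying an explicit gradient and Hessian instead of relying on the solver's numerical differences would recover the Newton-type flavour described in the excerpt.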