2017
DOI: 10.2298/fil1708195t
Smooth twin support vector machines via unconstrained convex minimization

Abstract: Twin support vector machine (TWSVM) exhibits fast training speed and better classification ability compared with standard SVM. However, it suffers from the following drawbacks: (i) the objective functions of TWSVM comprise only the empirical risk and thus may lead to overfitting and suboptimal solutions in some cases; (ii) two convex quadratic programming problems (QPPs) need to be solved, which is relatively complex to implement. To address these problems, we propose two smoothing approaches for an implicit Lag…
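The smoothing idea in the abstract can be illustrated with the smooth approximation of the plus function max(x, 0) commonly used in the smooth-SVM literature (a minimal sketch: the specific function and the parameter alpha below are the standard choice from that literature, not necessarily the exact approximation used in this paper):

```python
import numpy as np

def smooth_plus(x, alpha=5.0):
    """Smooth, everywhere-differentiable approximation of max(x, 0).

    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x)),
    computed here via logaddexp for numerical stability:
    p(x, alpha) = logaddexp(0, alpha * x) / alpha.
    p(x, alpha) -> max(x, 0) as alpha -> infinity.
    """
    x = np.asarray(x, dtype=float)
    return np.logaddexp(0.0, alpha * x) / alpha
```

Replacing the non-smooth plus function in the objective by such a p(·, alpha) is what turns the problem into a twice-differentiable unconstrained convex minimization, to which Newton-type methods can be applied.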

Cited by 15 publications (4 citation statements)
References 34 publications
“…Support Vector Machine (SVM) is chosen as the classifier for the experiments. SVM obtains the minimum structural risk to improve the model's learning and generalization ability by searching for the maximum-margin hyperplane (see, e.g., Vapnik, 2006; Tsochantaridis, 2005; Joachims, 2006, 2009; Chapelle, 2012; Swaminathan, 2015; Tanveer, 2017). Performance evaluation of the classifier on the dataset includes the precision, recall, and F-measure using the confusion matrix given in Table 1.…”
Section: Results
confidence: 99%
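The metrics named in the excerpt above follow directly from the confusion-matrix counts; a minimal sketch (the function name is mine, and F-measure is taken as the usual F1, i.e. the harmonic mean of precision and recall):

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall, and F1 from confusion-matrix counts.

    tp: true positives, fp: false positives, fn: false negatives.
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```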
“…Tanveer [175] also proposed an implicit Lagrangian TSVM, which is solved using a finite Newton method. Tanveer and Shubham [190] in 2017 proposed a smooth TSVM via UMP, which increases the generalization ability and training speed of TSVM.…”
Section: Fuzzy Twin Support Vector Machines
confidence: 99%
“…Recently, Tanveer, Sharma, and Suganthan (2019) proposed a novel general TSVM with pinball loss (Pin-GTSVM) that is insensitive to noise and performs better on noise-corrupted datasets. To retain the sparsity in the Pin-GTSVM, Tanveer, Tiwari, Choudhary, and Jalan (2019) proposed a novel sparse pinball twin SVM (SPTSVM) that is insensitive to outliers and retains sparsity. Richhariya and Tanveer (2020) used universum learning for the first time to solve the class imbalance problem, in which a reduced kernel was incorporated to reduce storage and computation cost.…”
Section: Introduction
confidence: 99%
“…In recent years, concepts based on constructing different variations of non-parallel hyperplanes, as in Tanveer (2015c), Tanveer and Shubham (2017), and Tanveer (2015a), have emerged. TSVM is an efficient classification method, and, as mentioned in Jayadeva et al (2007), the two non-parallel proximal hyperplanes are designed in such a way that each hyperplane is as close as possible to one class and as far as possible from the other class.…”
Section: Introduction
confidence: 99%
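The two non-parallel proximal hyperplanes described in the excerpt above can be sketched via a least-squares variant of TWSVM, which admits a closed-form solution (an illustrative sketch only: the original TWSVM solves two QPPs instead; the function names and the small ridge term eps are mine):

```python
import numpy as np

def lstsvm_planes(A, B, c=1.0, eps=1e-8):
    """Fit two non-parallel proximal hyperplanes, least-squares TWSVM style.

    Plane 1 (w1, b1) is close to class A and pushed roughly unit distance
    from class B; plane 2 is the mirror image. Each plane is the solution
    of a regularized linear system rather than a QPP.
    Returns augmented vectors u = [w; b].
    """
    eA, eB = np.ones((len(A), 1)), np.ones((len(B), 1))
    H1, G1 = np.hstack([A, eA]), np.hstack([B, eB])
    u1 = -c * np.linalg.solve(
        H1.T @ H1 + c * G1.T @ G1 + eps * np.eye(H1.shape[1]), G1.T @ eB
    ).ravel()
    H2, G2 = np.hstack([B, eB]), np.hstack([A, eA])
    u2 = c * np.linalg.solve(
        H2.T @ H2 + c * G2.T @ G2 + eps * np.eye(H2.shape[1]), G2.T @ eA
    ).ravel()
    return u1, u2

def classify(x, u1, u2):
    """Assign x to the class whose hyperplane is closer (perpendicular distance)."""
    x = np.append(np.asarray(x, dtype=float), 1.0)
    d1 = abs(x @ u1) / np.linalg.norm(u1[:-1])
    d2 = abs(x @ u2) / np.linalg.norm(u2[:-1])
    return 1 if d1 <= d2 else -1
```

On a toy two-class problem (one class along y = 0, the other along y = 2), each fitted plane passes near its own class, and a test point is labeled by the nearer plane, matching the "close to one class, far from the other" design mentioned in the excerpt.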