2010
DOI: 10.1016/j.ejor.2010.03.017
Simultaneous classification and feature selection via convex quadratic programming with application to HIV-associated neurocognitive disorder assessment

Cited by 31 publications (13 citation statements)
References 9 publications
“…The SVM method used, pq-SVM, was a modification of the Lagrangian Support Vector Machine (LSVM) method of Mangasarian and Musicant [26], incorporating feature selection [27]. The inequality for prediction of NP impairment using the non-normalized original data was scaled to a range of values of approximately −10 to +10, for uniformity between scenarios.…”
Section: Discussion
confidence: 99%
“…; 10} [27,28]. The steps of randomly choosing two-thirds of the data for training, the calculation of optimal parameters over the grid of values, and the choice of tuning parameters and predictor coefficients that achieve maximal testing efficiency were then repeated 1000 times.…”
confidence: 99%
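The resampling procedure quoted above can be sketched as follows. This is a minimal illustration, not the paper's method: the data, the tuning grid, and the classifier (a ridge-style linear rule standing in for pq-SVM) are all hypothetical, and the repeat count is cut from 1000 to 100 for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the study's dataset (hypothetical).
X = rng.normal(size=(90, 5))
y = np.where(X[:, 0] + 0.5 * rng.normal(size=90) > 0, 1, -1)

# Hypothetical tuning grid of powers of ten.
grid = [10.0 ** k for k in range(-3, 2)]

def train_and_score(X_tr, y_tr, X_te, y_te, lam):
    """Ridge-style linear classifier as a stand-in for pq-SVM (assumption)."""
    w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1]),
                        X_tr.T @ y_tr)
    return float(np.mean(np.sign(X_te @ w) == y_te))

best_scores = []
for rep in range(100):                  # the excerpt repeats this 1000 times
    idx = rng.permutation(len(y))
    cut = 2 * len(y) // 3               # two-thirds of the data for training
    tr, te = idx[:cut], idx[cut:]
    # Grid search, then keep the parameters with maximal testing efficiency.
    scores = [train_and_score(X[tr], y[tr], X[te], y[te], lam) for lam in grid]
    best_scores.append(max(scores))

print(round(float(np.mean(best_scores)), 3))
```

The point of the repetition is that a single random split gives a noisy efficiency estimate; averaging the per-split optima over many splits stabilizes it.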
“…Results for the Int method tend to be slightly better than those reported by Bradley and Mangasarian (2004) and comparable to the linear methods in Fung and Mangasarian (2004). Results for Int are much better than those reported by Dunbar et al. [2010] for the wdbc data set, and slightly worse for the pima data set.…”
Section: Figure 1: Average Performance Relative To Best
confidence: 54%
“…Guo and Dyer [2003] report good results when these techniques are applied in facial expression recognition. Dunbar et al. [2010] formulate the simultaneous hyperplane placement and feature selection problem as a nonlinear support vector machine problem, and then reformulate it as a quadratic minimization problem subject to nonnegativity constraints. This is an extension of a method originally proposed by .…”
Section: Introduction
confidence: 99%
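The reformulation described in this excerpt — a quadratic minimization subject only to nonnegativity constraints — can be illustrated with a generic projected-gradient sketch. The matrix `Q` and vector `c` below are synthetic, and the solver is a textbook method, not the authors' algorithm; it only shows the shape of the problem min 0.5 uᵀQu + cᵀu s.t. u ≥ 0.

```python
import numpy as np

rng = np.random.default_rng(1)

# Small synthetic problem (hypothetical): Q positive definite, c arbitrary.
A = rng.normal(size=(8, 8))
Q = A @ A.T + np.eye(8)
c = rng.normal(size=8)

def qp_nonneg(Q, c, steps=5000):
    """Minimize 0.5 u'Qu + c'u subject to u >= 0 by projected gradient."""
    u = np.zeros(len(c))
    step = 1.0 / np.linalg.norm(Q, 2)   # step size below 1/L ensures convergence
    for _ in range(steps):
        grad = Q @ u + c
        u = np.maximum(0.0, u - step * grad)  # gradient step, then project onto u >= 0
    return u

u = qp_nonneg(Q, c)
g = Q @ u + c
# KKT conditions at the solution: u >= 0, gradient >= 0, and complementarity u'g = 0.
print(bool(np.all(u >= 0)), float(np.min(g)) > -1e-6)
```

The nonnegativity-only constraint set is what makes this class of problems attractive: projection is a single `np.maximum`, so very simple iterative schemes apply.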
“…The support vector machine has emerged as a powerful modelling tool for machine learning problems of data classification that arise in many areas of information and computer sciences [24][25][26][27].…”
Section: Introduction
confidence: 99%