2018
DOI: 10.1155/2018/7538204

An Efficient Feature Selection Strategy Based on Multiple Support Vector Machine Technology with Gene Expression Data

Abstract: The application of gene expression data to the diagnosis and classification of cancer has become a prominent topic in cancer classification research. Gene expression data typically contain a large proportion of tumor-free samples and are very high-dimensional. To select determinant genes related to breast cancer from the initial gene expression data, we propose a new feature selection method, namely, a support vector machine based on recursive feature elimination and parameter optimization (SVM-R…
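The abstract describes selecting genes with an SVM wrapped in recursive feature elimination. A minimal sketch of that general approach, using scikit-learn's RFE with a linear SVM; the dataset, gene count, and hyperparameters below are synthetic placeholders, not the paper's setup:

```python
# Minimal sketch of SVM-based recursive feature elimination (RFE) for gene selection.
# Data shapes and hyperparameters are illustrative placeholders only.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2000))   # 100 samples x 2000 genes (synthetic stand-in)
y = rng.integers(0, 2, size=100)   # binary tumor / non-tumor labels

# A linear SVM supplies per-gene weights; RFE repeatedly drops the lowest-ranked genes.
svm = SVC(kernel="linear", C=1.0)
selector = RFE(estimator=svm, n_features_to_select=50, step=0.1)
selector.fit(X, y)

selected_genes = np.where(selector.support_)[0]
print("selected gene indices:", selected_genes[:10], "...")

# Accuracy of an SVM restricted to the selected genes (5-fold cross-validation).
scores = cross_val_score(SVC(kernel="linear"), X[:, selected_genes], y, cv=5)
print("mean CV accuracy:", scores.mean())
```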


Cited by 43 publications (23 citation statements)
References 39 publications
“…Common supervised artificial neural networks and their adaptations include the single-layer perceptron (SLP) [ 10 , 31 ], the multilayer perceptron (MLP) [ 42 , 43 ], and linear classifiers [ 44 , 45 ]. Other popular supervised methods include support vector machines (SVMs) [ 46 , 47 ] and k-nearest neighbours (kNNs) [48] , as well as Bayesian statistics [49] , decision trees [50] , and hidden Markov models (HMMs) [51] .…”
Section: Main Text
confidence: 99%
“…The solutions of (14) and (15) can be found by treating them as an eigenvalue problem, which yields the optimum values of the projection matrix P. In MATLAB 2017a, fitclda() was used to construct the LDA-based predictive model, which was then evaluated with 5-fold cross-validation to check classification performance. SVM: the SVM is an extensively employed classifier owing to its high prediction accuracy on high-dimensional features [60][61][62]. The SVM can be used as a linear or a nonlinear method.…”
Section: Conventional Classification Methods
confidence: 99%
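The citing work implements this step in MATLAB. A rough scikit-learn analogue of the LDA plus 5-fold cross-validation workflow, on synthetic placeholder data, might look like this:

```python
# Rough Python analogue of the LDA + 5-fold cross-validation step described above
# (the cited work uses MATLAB's fitclda); data here is synthetic and illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 30))       # 120 samples, 30 features (placeholder)
y = rng.integers(0, 2, size=120)     # binary class labels

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)  # 5-fold cross-validation
print("fold accuracies:", scores)
print("mean accuracy:", scores.mean())
```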
“…The optimal d* is the one that maximizes the distance between the hyperplane and the support vectors. This maximization is achieved by minimizing the cost function in (17) subject to the constraints given in (18) [60][61][62][63].…”
Section: Conventional Classification Methods
confidence: 99%
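The cost function and constraints referenced as (17) and (18) are not reproduced in the excerpt. For context, the standard soft-margin SVM primal, which is presumably the form intended, is:

\[
\begin{aligned}
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \quad & \frac{1}{2}\lVert \mathbf{w} \rVert^{2} + C \sum_{i=1}^{N} \xi_{i} \\
\text{subject to} \quad & y_{i}\left(\mathbf{w}^{\top}\mathbf{x}_{i} + b\right) \ge 1 - \xi_{i}, \qquad \xi_{i} \ge 0, \qquad i = 1, \dots, N .
\end{aligned}
\]

Minimizing \(\lVert \mathbf{w} \rVert\) maximizes the margin \(2 / \lVert \mathbf{w} \rVert\) between the two supporting hyperplanes, which is the maximization the quoted passage refers to.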
“…To predict the patient's outcome in terms of metastasis, we used a machine learning approach [34] called random forest (RF) [35]. The prediction model was evaluated on test data in terms of accuracy, precision, and recall.…”
Section: Evaluation of the Metastasis Prediction Model
confidence: 99%
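A hedged sketch of the evaluation step described in that statement: fit a random forest on training data, then report accuracy, precision, and recall on held-out test data. All data and hyperparameters below are illustrative placeholders, not the cited study's setup.

```python
# Sketch of random-forest metastasis prediction with accuracy/precision/recall on test data.
# Synthetic placeholder data; not the cited study's cohort or hyperparameters.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 40))        # 200 patients x 40 features (synthetic)
y = rng.integers(0, 2, size=200)      # 1 = metastasis, 0 = no metastasis

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)

print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred, zero_division=0))
print("recall   :", recall_score(y_test, y_pred, zero_division=0))
```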