2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
DOI: 10.1109/embc.2017.8036844

sEMG feature selection and classification using SVM-RFE

Abstract: Obtaining good results in hand movement classification is challenging. Previous studies have focused on filters for sEMG data, feature extraction, and classifier algorithms to achieve the best results. This paper proposes inserting a step into the classification process that selects which features to use in training, aiming to increase accuracy and performance. Feature selection has previously been used in other classification tasks but is new in wrist/finger movement classification. Obtained results …
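The paper's exact pipeline is not reproduced on this page, but the step it proposes — ranking sEMG features with SVM-RFE before training the classifier — can be sketched with scikit-learn. The dataset shape, feature count, and class count below are illustrative assumptions, not the paper's values; a linear kernel is used because RFE needs per-feature weights (coef_) to decide which feature to drop at each step.

```python
# Minimal sketch of SVM-RFE feature selection inserted before classification,
# assuming a feature matrix X (trials x sEMG features) and movement labels y.
# The data here is random and purely illustrative.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 16))   # e.g. 16 time-domain sEMG features
y = rng.integers(0, 5, size=200)     # e.g. 5 wrist/finger movement classes

# Linear-kernel SVM so RFE can read coefficient weights at each elimination step.
svm = SVC(kernel="linear", C=1.0)
rfe = RFE(estimator=svm, n_features_to_select=8, step=1)
rfe.fit(X, y)

print("Selected feature mask:", rfe.support_)
print("Feature ranking (1 = kept):", rfe.ranking_)

# Accuracy with the reduced feature set, estimated by cross-validation.
scores = cross_val_score(svm, X[:, rfe.support_], y, cv=5)
print("CV accuracy with selected features: %.3f" % scores.mean())
```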

Cited by 19 publications (9 citation statements). References 26 publications.
“…The optimal features should be extracted by some criteria. Tosin et al. [42] demonstrated that RFE was a powerful feature selection algorithm. However, the output was a list of ranks in separability without detailed values.…”
Section: Discussion
confidence: 99%
“…In contrast, embedded methods rely on criteria that are generated during the classifier training process. Examples of embedded techniques are the support vector machine (SVM)-based Recursive Feature Elimination (RFE) [42] and the linear discriminant analysis (LDA)-based Fisher’s Discriminant (FD) function [17].…”
Section: Introduction
confidence: 99%
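To make the "embedded" distinction in the excerpt above concrete: in SVM-RFE the elimination criterion comes from the weights of the classifier being trained, not from a separate filter statistic. A minimal sketch of that criterion, again with purely illustrative random data:

```python
# Sketch of the embedded criterion behind SVM-RFE: squared weights of a
# fitted linear SVM act as per-feature scores (summed over class pairs
# for multiclass). Data dimensions here are illustrative only.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 16))
y = rng.integers(0, 5, size=200)

svm = SVC(kernel="linear").fit(X, y)
scores = np.square(svm.coef_).sum(axis=0)   # one score per feature
print("Weakest feature (eliminated first by RFE):", int(np.argmin(scores)))
```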
“…The accuracy of an SVM classifier is sensitive to the choice of features, which has been reported in the past. This motivated us to perform feature optimization for this particular study.…”
Section: Methods
confidence: 92%
“…The accuracy of an SVM classifier is sensitive to the choice of features, which has been reported in the past. 32,33 This motivated us to perform feature optimization for this particular study. At first, the 12-fold CV SVM classification error was obtained for individual features, for both multiclass as well as binary classification tasks.…”
Section: Statistical Analysis Of Features From Selected
confidence: 99%
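The per-feature screening described in the last excerpt — a 12-fold cross-validated SVM error computed for each feature individually — can be sketched as below. The 12-fold split matches the quoted text, while the trial count, feature count, and labels are illustrative assumptions.

```python
# Sketch of per-feature screening: 12-fold CV error of an SVM trained on
# each feature in isolation. Dataset dimensions are illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.standard_normal((240, 16))   # 240 trials, 16 candidate features
y = rng.integers(0, 5, size=240)     # multiclass movement labels

errors = []
for j in range(X.shape[1]):
    acc = cross_val_score(SVC(kernel="linear"), X[:, [j]], y, cv=12).mean()
    errors.append(1.0 - acc)         # 12-fold CV classification error per feature

best = int(np.argmin(errors))
print("Feature with lowest 12-fold CV error:", best)
```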