2021
DOI: 10.1016/j.bspc.2021.102577

Comparison of machine learning methods in sEMG signal processing for shoulder motion recognition

Cited by 24 publications (16 citation statements)
References 39 publications

“…The support vector machine (SVM) and CNN are two popular methods for ML-based EMG signal processing. Only a few studies have successfully demonstrated ML approaches that process EMG signals for real-time robot control using both RMS and frequency variables (Zhou et al, 2021). A powerful computer and LabView software with a machine learning toolbox are required to implement the ML tasks, with significant effort devoted to establishing a communication portal between the EMG sensor system and the ML processing toolbox software.…”
Section: Discussion
confidence: 99%
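
To make the “RMS and frequency variables” mentioned in this statement concrete, here is a minimal sketch that computes a root-mean-square value and a median frequency from a single sEMG window and feeds such features to an SVM. The sampling rate, window length, toy data, and function names are assumptions for illustration only, not details taken from the cited work.

    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC

    FS = 1000  # assumed sEMG sampling rate in Hz (not taken from the cited work)

    def rms_and_median_frequency(window):
        """Compute RMS and median frequency for one 1-D sEMG window."""
        rms = np.sqrt(np.mean(window ** 2))
        freqs, psd = welch(window, fs=FS, nperseg=min(256, len(window)))
        cumulative = np.cumsum(psd)
        median_freq = freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]
        return np.array([rms, median_freq])

    # Toy data: random "EMG" windows with dummy labels, just to show the shapes.
    rng = np.random.default_rng(0)
    windows = rng.standard_normal((40, 270))   # 40 windows of 270 samples each
    labels = rng.integers(0, 2, size=40)       # two dummy motion classes
    features = np.vstack([rms_and_median_frequency(w) for w in windows])

    clf = SVC(kernel="rbf").fit(features, labels)
    print(clf.predict(features[:5]))
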
“…Artificial intelligence and machine learning (ML) are emerging techniques in EMG signal processing for motion pattern recognition and robot control. However, ML-based firmware is expensive, and significant effort is needed for algorithm development (Jiang et al, 2020; Zhou et al, 2021). In recent years, deep learning-based human-robot interaction (HRI) has been developed (Qi et al, 2021, 2022; Su et al, 2021a) and has achieved higher recognition accuracy and faster inference speed with the help of GPUs.…”
Section: Introduction
confidence: 99%
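
As a loose illustration of the deep-learning classifiers referred to above, the following sketch defines a small 1-D CNN in PyTorch that maps fixed-length multi-channel sEMG windows to motion classes. The channel count, window length, and layer sizes are illustrative assumptions, not the architectures used in the cited works.

    import torch
    import torch.nn as nn

    class EmgCnn(nn.Module):
        """Minimal 1-D CNN for sEMG window classification (illustrative only)."""
        def __init__(self, n_channels=8, n_classes=5):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # global average pooling over time
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x):              # x: (batch, channels, samples)
            return self.classifier(self.features(x).squeeze(-1))

    # Forward pass on a dummy batch of 270-sample, 8-channel windows.
    model = EmgCnn()
    dummy = torch.randn(4, 8, 270)
    print(model(dummy).shape)              # torch.Size([4, 5])
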
“…Jose et al (2017) extracted time-domain sEMG features of the subjects' forearm movements and classified them with a multi-layer perceptron network, reaching a classification accuracy of 91.6%. In Zhou et al (2021), machine learning was applied to the recognition of shoulder movements; a support vector machine (SVM) with a sliding time window of 270 ms achieved a classification accuracy of more than 90% (Hinton et al, 2012). Narayan et al (2018) extracted sEMG features by first-order differentiation and classified them with a medium tree classifier, improving classification accuracy by 6% compared with other features.…”
Section: sEMG-based HRI Related Study
confidence: 99%
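
The sliding-time-window segmentation described in this statement can be sketched as follows; the 270 ms window length matches the description, while the sampling rate, window increment, and toy data are assumptions for illustration.

    import numpy as np

    FS = 1000          # assumed sampling rate in Hz
    WINDOW_MS = 270    # window length from the cited description
    STEP_MS = 50       # window increment; an assumption, not from the cited work

    def sliding_windows(emg, fs=FS, window_ms=WINDOW_MS, step_ms=STEP_MS):
        """Split a (samples, channels) sEMG recording into overlapping windows."""
        win = int(fs * window_ms / 1000)
        step = int(fs * step_ms / 1000)
        return np.stack([emg[start:start + win]
                         for start in range(0, len(emg) - win + 1, step)])

    emg = np.random.randn(5000, 4)         # 5 s of 4-channel toy "sEMG"
    windows = sliding_windows(emg)
    print(windows.shape)                   # (number_of_windows, 270, 4)
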
“…Traditional sEMG-based HRI usually extracts hand-crafted features and then uses machine learning methods to build mappings from these features to different lower-limb movements (Jose et al, 2017; Motoche and Benalcázar, 2018; Narayan et al, 2018; Khiabani and Ahmadi, 2021; Zhou et al, 2021). The interface is computationally cheap and can achieve relatively good lower-limb movement prediction performance in most cases.…”
Section: Introduction
confidence: 99%
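
A minimal sketch of the kind of hand-crafted time-domain features such interfaces typically compute (mean absolute value, waveform length, zero crossings, slope sign changes) is given below; the threshold value and the exact feature set are assumptions, not the features used in the cited studies.

    import numpy as np

    def hand_crafted_features(window, threshold=0.01):
        """Classic time-domain features for one 1-D sEMG window:
        mean absolute value (MAV), waveform length (WL),
        zero crossings (ZC), slope sign changes (SSC).
        The threshold and feature choice are illustrative assumptions."""
        diff = np.diff(window)
        mav = np.mean(np.abs(window))
        wl = np.sum(np.abs(diff))
        zc = np.sum((window[:-1] * window[1:] < 0) &
                    (np.abs(diff) >= threshold))
        ssc = np.sum((diff[:-1] * diff[1:] < 0) &
                     ((np.abs(diff[:-1]) >= threshold) |
                      (np.abs(diff[1:]) >= threshold)))
        return np.array([mav, wl, zc, ssc])

    print(hand_crafted_features(np.random.randn(270)))
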
“…sEMG signals have been shown to perform well for human intention prediction because they immediately reflect muscle activity and movement intention [23]. In general, the basic process of sEMG-based human intention prediction includes data preprocessing, data segmentation, feature extraction, and classifier design [24]. Action-recognition accuracy, as well as real-time performance and model robustness, are very important factors during rehabilitation training.…”
Section: Introduction
confidence: 99%
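
The four-stage process named in this statement (preprocessing, segmentation, feature extraction, classifier design) can be sketched end to end as follows; the filter band, window length, feature choice, classifier, and toy data are assumptions for illustration only, not the pipeline of the cited work.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    FS = 1000  # assumed sampling rate in Hz

    # 1) Preprocessing: band-pass filter the raw signal (20-450 Hz is a common
    #    sEMG band; the exact cut-offs are an assumption).
    def preprocess(raw):
        b, a = butter(4, [20 / (FS / 2), 450 / (FS / 2)], btype="band")
        return filtfilt(b, a, raw)

    # 2) Segmentation: fixed-length non-overlapping windows (length assumed).
    def segment(signal, win=250):
        n = len(signal) // win
        return signal[: n * win].reshape(n, win)

    # 3) Feature extraction: mean absolute value and RMS per window.
    def extract(windows):
        return np.column_stack([np.mean(np.abs(windows), axis=1),
                                np.sqrt(np.mean(windows ** 2, axis=1))])

    # 4) Classifier design: train and evaluate an SVM on toy labels.
    raw = np.random.randn(20000)                     # 20 s of toy "sEMG"
    X = extract(segment(preprocess(raw)))
    y = np.random.randint(0, 2, size=len(X))         # dummy motion labels
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = SVC().fit(X_tr, y_tr)
    print("toy accuracy:", clf.score(X_te, y_te))
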