2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
DOI: 10.1109/embc.2019.8856699

Integration of Forearm sEMG Signals with IMU Sensors for Trajectory Planning and Control of Assistive Robotic Arm

Cited by 6 publications (3 citation statements). References 11 publications.
“…Surface EMG signals of participants were recorded and used to train custom neural networks for each test subject. Based on our previous research on hand gesture-based powered wheelchair control and virtual robotic arm control, it was decided to use the scaled conjugate backpropagation type of neural network, due to its higher accuracy and shorter training time [24, 25]. The MATLAB neural network pattern recognition toolbox offers several different graphical representations of the accuracy of neural networks, including confusion matrices and error histograms.…”
Section: Results (mentioning)
Confidence: 99%
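The excerpt above describes the common sEMG pattern-recognition pipeline: record windowed muscle signals, extract features, and train a small feedforward classifier per subject. A minimal NumPy sketch of that pipeline is shown below. Everything here is illustrative: the synthetic four-channel "sEMG" windows, the RMS features, and the plain gradient-descent update (standing in for MATLAB's scaled conjugate gradient `trainscg`, whose update rule is not reproduced) are assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_features(windows):
    # Root-mean-square per channel: a standard sEMG time-domain feature.
    return np.sqrt(np.mean(windows ** 2, axis=-1))

def make_class(n, channel_scales):
    # Synthetic stand-in for windowed sEMG: n windows, 4 channels,
    # 200 samples, with a per-channel amplitude profile.
    scales = np.asarray(channel_scales)[:, None]  # (4, 1) broadcasts over samples
    return rng.normal(0.0, scales, size=(n, 4, 200))

# Two hypothetical gestures with different channel activation patterns.
X = np.vstack([rms_features(make_class(60, [1.0, 0.2, 0.2, 0.2])),
               rms_features(make_class(60, [0.2, 0.2, 0.2, 1.0]))])
y = np.repeat([0, 1], 60)

# One-hidden-layer network trained with plain gradient descent
# (a simplified substitute for scaled conjugate gradient backprop).
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)
lr = 0.5
for _ in range(500):
    H = np.tanh(X @ W1 + b1)                      # hidden activations
    logits = H @ W2 + b2
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)             # softmax probabilities
    G = P.copy(); G[np.arange(len(y)), y] -= 1    # cross-entropy gradient
    G /= len(y)
    dW2 = H.T @ G; db2 = G.sum(0)
    dH = G @ W2.T * (1 - H ** 2)                  # backprop through tanh
    dW1 = X.T @ dH; db1 = dH.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2).argmax(1) == y)
print(f"training accuracy: {acc:.2f}")
```

On this toy problem the two gesture classes are cleanly separable by RMS amplitude alone, so even the simplified optimizer converges; real sEMG classification typically needs richer features (mean absolute value, zero crossings, waveform length) and cross-validated evaluation, as a confusion matrix would show in the MATLAB toolbox the authors cite.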
“…However, for decades, researchers in academia and industry have tried to develop more innovative and more inclusive types of interfaces. Voice control and speech recognition systems [40], electrooculography-based interfaces [41], eye-gaze detection devices [42], vision-based gesture recognition algorithms [43], and biosignal-based interfaces [44] have been developed to translate user intentions into robot commands for specific tasks. In addition to these standard interfaces, various approaches have also been proposed for intelligent detection and grasping of objects within the workspace of the ARM [45].…”
Section: Assistive Robotic Manipulators (mentioning)
Confidence: 99%
“…Existing human-machine collaboration technology is usually implemented with equipment that detects signals issued by the human body, such as muscle electrical signals [3], or with the coordination of cameras and machine vision. These methods have high accuracy [4], but they consume substantial computing resources and suffer from other problems such as low sampling rates, complex configuration, and long calibration times.…”
Section: Introduction (mentioning)
Confidence: 99%