2021
DOI: 10.3390/s21020498

Elbow Motion Trajectory Prediction Using a Multi-Modal Wearable System: A Comparative Analysis of Machine Learning Techniques

Abstract: Motion intention detection is fundamental in the implementation of human-machine interfaces applied to assistive robots. In this paper, multiple machine learning techniques have been explored for creating upper limb motion prediction models, which generally depend on three factors: the signals collected from the user (such as kinematic or physiological), the extracted features and the selected algorithm. We explore the use of different features extracted from various signals when used to train multiple algorit…
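As a rough illustration of the comparative setup the abstract describes (several algorithms trained on features extracted from kinematic and physiological signals to predict elbow motion), the sketch below compares a few scikit-learn regressors with cross-validation. The feature count, window layout, and model choices are illustrative assumptions, not the paper's actual pipeline or data.

```python
# Minimal sketch: comparing regression algorithms for elbow-angle prediction
# from multi-modal features. Data shapes and feature choices are assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical data: 1000 windows, each described by features extracted from
# sEMG (e.g., mean absolute value) and IMU (e.g., mean angular velocity).
X = rng.normal(size=(1000, 12))   # 12 features per window (assumed)
y = rng.normal(size=1000)         # future elbow angle (target)

models = {
    "ridge": Ridge(alpha=1.0),
    "svr": SVR(kernel="rbf"),
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "mlp": MLPRegressor(hidden_layer_sizes=(64,), max_iter=1000, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {-scores.mean():.3f} +/- {scores.std():.3f}")
```

On real data, the same loop would be repeated per feature set and per signal combination to reproduce the kind of comparison the abstract describes.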

Cited by 13 publications (9 citation statements)
References 42 publications (35 reference statements)
“…Figure 3 shows the links of key intelligent technologies, which are indispensable. Therefore, to use intelligent analysis technology, it is necessary to have a general understanding of these key links and how to use them [22]. The…”
Section: Improved BP Neural Algorithm
confidence: 99%
“…Despite the high correlation between sEMG and the intensity of neural drives to target muscles, sEMG signals alone may not be adequate for many practical applications of multi-functional upper-limb HMI, mainly because of 1) the large number of DoFs and the non-cyclical nature of the upper extremity's movements [28]; and 2) the complex patterns of EMG influenced by the anatomical and physiological properties of muscles, such as the limited spatial resolution caused by muscle cross-talk [27]. To this end, the fusion of sEMG with other signals has gained considerable attention, so that more complementary information can be obtained from the environment to compensate for the shortcomings of sEMG.…”
Section: A Multi-Modal Sensing Fusion
confidence: 99%
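The statement above motivates combining sEMG with complementary signals. One common way this is done in practice is feature-level fusion, where features from each modality are concatenated into a single vector for the downstream model. The sketch below assumes sEMG plus IMU, specific window lengths, channel counts, and feature choices; none of these come from the cited works.

```python
# Minimal sketch of feature-level sensor fusion: time-domain features from a
# window of sEMG are concatenated with IMU features before classification or
# regression. Window lengths, channel counts, and features are assumptions.
import numpy as np

def emg_features(window):
    """Common time-domain sEMG features per channel: MAV, RMS, waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, rms, wl])

def imu_features(window):
    """Simple IMU features per axis: mean and standard deviation."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(emg_window, imu_window):
    """Feature-level fusion: one joint feature vector for the downstream model."""
    return np.concatenate([emg_features(emg_window), imu_features(imu_window)])

# Example: a 200-sample sEMG window (4 channels) + a 50-sample IMU window (6 axes).
rng = np.random.default_rng(0)
emg = rng.normal(size=(200, 4))
imu = rng.normal(size=(50, 6))
print(fuse(emg, imu).shape)   # (4*3 + 6*2,) = (24,)
```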
“…Different from previous surveys, this paper provides a systematic review of recent progress towards model robustness, adaptation, and reliability in ML/DL-based upper-limb myoelectric control. Firstly, the main factors that limit ML/DL implementations can be summarised as follows: 1) upper-limb movements are non-cyclical and involve a large number of DoFs, so the information provided by sEMG signals may not be adequate for precise control [27,28]; 2) the characteristics of sEMG are time-varying and user-specific, and they can easily be influenced by numerous disturbances in practical environments [22]; 3) high estimation accuracy can still lead to unintended activation, causing additional operations, cognitive burdens, and even unacceptable risks [20]. In this context, related efforts will be introduced in three aspects: 1) multi-modal fusion techniques to provide additional information in myoelectric control; 2) transfer learning methods to reduce the impact of domain shift on ML/DL algorithms; and 3) post-processing approaches to enhance the reliability of estimation outcomes.…”
Section: Introduction
confidence: 99%
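Of the three aspects listed in the statement above, post-processing is the simplest to illustrate. A widely used example (my choice here, not necessarily the one the cited review emphasises) is majority voting over recent classifier decisions, optionally combined with low-confidence rejection; the parameter values below are illustrative assumptions.

```python
# Minimal sketch of post-processing for myoelectric control outputs:
# majority voting over the last N per-frame decisions, plus rejection of
# low-confidence frames. Window size and threshold are assumed values.
from collections import Counter, deque

def majority_vote_stream(decisions, window=5):
    """Smooth a stream of per-frame class decisions with a sliding majority vote."""
    history = deque(maxlen=window)
    smoothed = []
    for label in decisions:
        history.append(label)
        smoothed.append(Counter(history).most_common(1)[0][0])
    return smoothed

def reject_low_confidence(label, probability, threshold=0.7, rest_label="rest"):
    """Fall back to a safe 'no action' output when confidence is below threshold."""
    return label if probability >= threshold else rest_label

# Example: an isolated spurious "close" decision is smoothed away.
print(majority_vote_stream(["open", "open", "close", "open", "close", "close", "close"]))
```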
“…Recent developments in the use of attention-based models have shown their potential for accurate labelling of multi-modal wearable sensor data, such as data from HAR tasks performed with upper and lower limbs [101,102]. Figure 5 shows examples of the use of LSTM, ANN, HMM, DBN, RF and DT methods for activity recognition with different wearable sensor platforms [81,82,89,103,104].…”
Section: Deep Learning
confidence: 99%
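As a concrete reference point for the LSTM-based HAR methods mentioned above, the following is a minimal PyTorch sketch of an LSTM classifier over windowed wearable-sensor data. The window length, channel count, hidden size, and number of classes are illustrative assumptions, not values taken from the cited works.

```python
# Minimal sketch of an LSTM classifier for windowed wearable-sensor data,
# in the spirit of the HAR methods cited above. Sizes are assumptions.
import torch
import torch.nn as nn

class HarLSTM(nn.Module):
    def __init__(self, n_channels=6, hidden_size=64, n_classes=5):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):           # x: (batch, time, channels)
        _, (h_n, _) = self.lstm(x)  # final hidden state summarises the window
        return self.head(h_n[-1])   # (batch, n_classes) logits

# Example: a batch of 8 windows, 128 time steps, 6 IMU channels (assumed).
model = HarLSTM()
logits = model(torch.randn(8, 128, 6))
print(logits.shape)  # torch.Size([8, 5])
```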