2021
DOI: 10.1109/lra.2021.3091698
Temporal Dilation of Deep LSTM for Agile Decoding of sEMG: Application in Prediction of Upper-Limb Motor Intention in NeuroRobotics

Cited by 29 publications (18 citation statements)
References 49 publications
“…First, we determine the previous contents required for the forget gate in the process of graphic language representation and recognition, as well as the data memory to be retained in the neurons of the optimized model. The specific expressions are shown in formulas (8) and (9).…”
Section: Methods (mentioning)
confidence: 99%
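The cited formulas (8) and (9) are not reproduced in this excerpt. For orientation only, the standard LSTM forget-gate and cell-state updates that such an optimized model would presumably build on can be written (in conventional notation, an assumption rather than the cited paper's exact form) as:

f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)
c_t = f_t \odot c_{t-1} + i_t \odot \tanh(W_c x_t + U_c h_{t-1} + b_c)

where f_t controls how much of the previous cell state c_{t-1} is retained and i_t gates the newly written content.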
“…When this pattern is generally recognized, it can be seen through graphics that the purpose of guiding behavior becomes a visual effect. The capture and understanding of this visual effect by the final target object form a complete visual experience [9]. Therefore, in visual communication design, we need to be extra cautious in the selection of graphics.…”
mentioning
confidence: 99%
“…For example, in Reference [7], the authors proposed and used a CNN architecture to extract spatial information from sEMG signals and perform HGR classification. In addition to CNN-based architectures, some studies [33], [34] used Recurrent Neural Networks (RNNs) to extract temporal features from the sEMG signals. RNNs are used because sEMG signals are sequential in nature, and recurrent networks such as Long Short-Term Memory (LSTM) can extract the patterns in a sequence of sEMG data, treating HGR as a sequence-modeling task.…”
Section: Related Work (mentioning)
confidence: 99%
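As a rough illustration of treating HGR as a sequence-modeling task, the following minimal PyTorch sketch feeds windowed multi-channel sEMG into an LSTM; the channel count, window length, hidden size, and number of gesture classes are hypothetical and not taken from the cited works.

# Minimal sketch of an LSTM-based sEMG gesture classifier (PyTorch).
# Shapes and layer sizes are illustrative assumptions only.
import torch
import torch.nn as nn

class EMGLSTMClassifier(nn.Module):
    def __init__(self, n_channels=8, hidden_size=128, n_layers=2, n_classes=10):
        super().__init__()
        # batch_first=True -> input shape (batch, time_steps, n_channels)
        self.lstm = nn.LSTM(n_channels, hidden_size, n_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_channels) windowed sEMG
        out, _ = self.lstm(x)
        # use the hidden state at the last time step for classification
        return self.fc(out[:, -1, :])

# Usage: a batch of 4 windows, 200 samples each, 8 sEMG channels
model = EMGLSTMClassifier()
logits = model(torch.randn(4, 200, 8))   # -> (4, 10) gesture scores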
“…Simply put, LSTM can achieve better performance on longer sequences than a standard RNN. It is therefore suitable for processing bioelectrical signals with time-series characteristics [64,65]. Furthermore, it is also suitable for the time-series RMS feature we employ here.…”
Section: Grip Force Prediction Algorithm and Hand Rehabilitation Cont... (mentioning)
confidence: 99%
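For orientation only, a windowed RMS feature of the kind mentioned above can be computed as in the NumPy sketch below; the window and step lengths are illustrative assumptions, not values from the cited study.

# Sketch of windowed RMS feature extraction from raw sEMG, as commonly used
# before a recurrent predictor; window/step lengths here are illustrative only.
import numpy as np

def rms_features(emg, window=200, step=50):
    """emg: (n_samples, n_channels) raw sEMG -> (n_windows, n_channels) RMS."""
    feats = []
    for start in range(0, emg.shape[0] - window + 1, step):
        seg = emg[start:start + window]
        feats.append(np.sqrt(np.mean(seg ** 2, axis=0)))
    return np.asarray(feats)

# Example: 2 s of 8-channel sEMG sampled at 1 kHz
emg = np.random.randn(2000, 8)
features = rms_features(emg)   # one RMS vector per sliding window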