2021 IEEE Biomedical Circuits and Systems Conference (BioCAS)
DOI: 10.1109/biocas49922.2021.9645009
Residual Learning Attention CNN for Motion Intention Recognition Based on EEG Data

Cited by 2 publications (3 citation statements)
References 11 publications
“…Of the seven baseline models, five employ attention mechanisms. It is worth noting that [46] and [44] employ a global attention mechanism that encompasses all three dimensions. However, their approach is characterized by extracting attention scores individually for each dimension after using pooling methods to minimize unrelated dimensions.…”
Section: A Comparison With State-of-the-Art Methods (mentioning)
Confidence: 99%
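The dimension-wise scoring this statement describes can be pictured with a small sketch. The PyTorch snippet below is an illustrative assumption, not the cited authors' code: each attention module average-pools away every axis of a (batch, feature, electrode, time) EEG tensor except its target one, scores that axis with a small MLP, and rescales the tensor. The tensor layout, sizes, and reduction ratio are all hypothetical.

# Minimal sketch (assumed shapes and layer sizes) of per-dimension attention
# obtained by pooling out the remaining dimensions, as described above.
import torch
import torch.nn as nn


class DimensionAttention(nn.Module):
    """Scores one dimension of a (B, F, E, T) feature map after pooling the rest."""

    def __init__(self, dim_size: int, reduction: int = 4):
        super().__init__()
        hidden = max(dim_size // reduction, 1)
        self.mlp = nn.Sequential(
            nn.Linear(dim_size, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, dim_size),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor, dim: int) -> torch.Tensor:
        # Average-pool every dimension except the batch dim and the target dim.
        pool_dims = [d for d in range(1, x.ndim) if d != dim]
        scores = self.mlp(x.mean(dim=pool_dims))      # (B, dim_size)
        # Broadcast the scores back along the target dimension only.
        shape = [1] * x.ndim
        shape[0], shape[dim] = x.size(0), x.size(dim)
        return x * scores.view(*shape)


if __name__ == "__main__":
    feats = torch.randn(8, 16, 22, 128)               # batch, feature, electrode, time
    att_feature = DimensionAttention(dim_size=16)
    att_electrode = DimensionAttention(dim_size=22)
    att_time = DimensionAttention(dim_size=128)
    # Three attention steps, each dedicated to a single dimension.
    out = att_time(att_electrode(att_feature(feats, dim=1), dim=2), dim=3)
    print(out.shape)                                   # torch.Size([8, 16, 22, 128])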
“…This method intertwines a two-layer CNN and an attention model, feeding their concatenated outputs into a classifier. A notable trend is the use of global attention in conjunction with three sequential models, as seen in [44]–[46]. These works employ three sequential attention models, each dedicated to a single dimension.…”
Section: B Attention Mechanism (mentioning)
Confidence: 99%
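The first sentence of this statement, a two-layer CNN running alongside an attention model with their concatenated outputs fed to a classifier, can be sketched as follows. This is a rough, self-contained PyTorch illustration under assumed input shape (1 x 22 electrodes x 128 samples) and assumed layer sizes; it is not the implementation from the cited paper.

# Sketch (assumed sizes) of a two-layer CNN branch and a simple attention branch
# whose flattened outputs are concatenated before a linear classifier.
import torch
import torch.nn as nn


class CNNAttentionClassifier(nn.Module):
    def __init__(self, electrodes: int = 22, samples: int = 128, n_classes: int = 4):
        super().__init__()
        # Two-layer convolutional branch: temporal then spatial convolution.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 7), padding=(0, 3)),
            nn.ReLU(inplace=True),
            nn.Conv2d(8, 16, kernel_size=(electrodes, 1)),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d((1, 16)),             # -> (B, 16, 1, 16)
        )
        # Attention branch: one normalized score per electrode from its time course.
        self.attn = nn.Sequential(nn.Linear(samples, 1), nn.Softmax(dim=1))
        self.classifier = nn.Linear(16 * 16 + electrodes, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, electrodes, samples)
        cnn_out = self.cnn(x).flatten(1)                # (batch, 256)
        attn_out = self.attn(x.squeeze(1)).squeeze(-1)  # (batch, electrodes)
        # Concatenate both branches and classify.
        return self.classifier(torch.cat([cnn_out, attn_out], dim=1))


if __name__ == "__main__":
    model = CNNAttentionClassifier()
    logits = model(torch.randn(8, 1, 22, 128))
    print(logits.shape)                                 # torch.Size([8, 4])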