2023
DOI: 10.1109/access.2023.3329678

Deep Learning in EEG-Based BCIs: A Comprehensive Review of Transformer Models, Advantages, Challenges, and Applications

Berdakh Abibullaev,
Aigerim Keutayeva,
Amin Zollanvari

Abstract: Brain-computer interfaces (BCIs) have undergone significant advancements in recent years. The integration of deep learning techniques, specifically transformers, has shown promising development in research and application domains. Transformers, which were originally designed for natural language processing, have now made notable inroads into BCIs, offering a unique self-attention mechanism that adeptly handles the temporal dynamics of brain signals. This comprehensive survey delves into the application of tran…

Cited by 8 publications (1 citation statement) · References: 223 publications
“…Attention mechanisms have been extensively employed in various EEG signal analyses [24,25]. They enable models to focus on different parts of the input, intelligently assigning weights to different inputs based on the specific task, demonstrating excellent performance across a variety of tasks [26]. Its essence lies in computing attention coefficients for corresponding Q (Query), K (Key), and V (Value) [27], as shown in (1), thereby assigning weights to different information.…”
Section: Attention Module Selection
Confidence: 99%
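The Q (Query), K (Key), V (Value) formulation referenced in the citation statement corresponds to standard scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal sketch follows; the function name, tensor shapes, and the "EEG time-window token" framing are illustrative assumptions, not details taken from the cited papers.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Shapes are illustrative: Q and K are (n_tokens, d_k), V is (n_tokens, d_v).
    For EEG, "tokens" might be time windows or channels, depending on the model.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # raw attention coefficients
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over keys
    return weights @ V                                  # weighted sum of values

# Toy self-attention example: 4 hypothetical EEG "tokens" with 8-dim embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(X, X, X)             # self-attention: Q = K = V = X
print(out.shape)                                        # (4, 8)
```

In practice, transformer models learn separate projection matrices that map the input to Q, K, and V before this computation; the sketch above omits those projections to show only the weighting step the citation statement describes.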