2021
DOI: 10.3390/brainsci11010075
Hybrid Deep Learning (hDL)-Based Brain-Computer Interface (BCI) Systems: A Systematic Review

Abstract: Background: Brain-Computer Interfaces (BCIs) are becoming more reliable thanks to advances in Artificial Intelligence (AI). In particular, hybrid Deep Learning (hDL), which combines different DL algorithms, has gained momentum over the past five years. In this work, we propose a review of hDL-based BCIs starting from the seminal studies in 2015. Objectives: We reviewed 47 papers applying hDL to BCI systems, published between 2015 and 2020, extracting trends and highlighting relevant aspects of the topic…

Cited by 70 publications (56 citation statements)
References 81 publications
“…Moreover, regarding the necessity of cleaning the data, different pre-processing techniques, such as the supervised ICA approach, were highly time-consuming. On this aspect, neural networks can help, since they do not require heavy pre-processing, as shown in some studies [49]. In those studies, only minimal pre-processing, such as removing or interpolating bad channels, was applied, and the burden of learning from a potentially noisy signal was left to the neural network [49]…”
Section: Discussion (mentioning)
confidence: 99%
“…On this aspect, neural networks can help, since they do not require heavy pre-processing, as shown in some studies [49]. In those studies, only minimal pre-processing, such as removing or interpolating bad channels, was applied, and the burden of learning from a potentially noisy signal was left to the neural network [49]. On the other hand, machine learning methods based on artificial neural networks, which can automatically detect and classify features from raw data (including unsupervised training on raw input data, i.e., automatic feature selection and dimensionality reduction), are computationally expensive to train and to tune hyperparameters for…”
Section: Discussion (mentioning)
confidence: 99%
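To make the minimal pre-processing described in these citing passages concrete, the sketch below marks a bad EEG channel and interpolates it with MNE-Python before the data are handed to a network. The channel names, sampling rate, and synthetic data are illustrative assumptions, not details taken from the cited studies.

```python
# Minimal pre-processing sketch, assuming MNE-Python is available:
# mark bad channels and interpolate them, leaving everything else to the network.
import numpy as np
import mne

# Hypothetical 8-channel EEG segment (channel names and sampling rate are assumptions).
ch_names = ["Fz", "Cz", "Pz", "Oz", "C3", "C4", "P3", "P4"]
sfreq = 250.0  # Hz
data = np.random.randn(len(ch_names), int(10 * sfreq)) * 1e-5  # 10 s of noise, in volts

info = mne.create_info(ch_names, sfreq, ch_types="eeg")
raw = mne.io.RawArray(data, info)
raw.set_montage("standard_1020")       # electrode positions are needed for interpolation

raw.info["bads"] = ["C3"]              # channel flagged as bad (e.g., by visual inspection)
raw.interpolate_bads(reset_bads=True)  # spherical-spline interpolation of the bad channel

# The minimally cleaned array can now be fed to a deep network as-is.
X = raw.get_data()
```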
“…Artificial neural networks (ANNs) are widely recognized computational models inspired by the human brain and its connectivity patterns [16]. In the simplest case, a fully connected feed-forward network forms a computation graph with three layers (an input layer, a hidden layer, and an output layer) [17]. The individual computing units in each layer, called neurons, apply a nonlinear transformation to the input data…”
Section: Artificial Neural Network (mentioning)
confidence: 99%
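As an illustration of the three-layer feed-forward ANN described in the quotation above, here is a minimal PyTorch sketch; the layer sizes and the ReLU nonlinearity are assumptions made for the example, not specifications from the cited references [16,17].

```python
# Minimal sketch of a three-layer feed-forward network (input -> hidden -> output).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 32),  # input layer -> hidden layer (64 input features assumed)
    nn.ReLU(),          # nonlinear transformation applied by the hidden neurons
    nn.Linear(32, 4),   # hidden layer -> output layer (4 classes assumed)
)

x = torch.randn(8, 64)         # batch of 8 feature vectors
logits = model(x)              # shape: (8, 4)
probs = logits.softmax(dim=1)  # class probabilities per example
```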
“…The most common approach is the convolutional neural network, with a variety of architectures such as temporal CNNs (TCNNs), temporal graph convolutional networks (TGCNs), and CNN-recurrent neural networks (RNNs) [161]. A CNN's basic structure consists of convolutional layers, max-pooling layers, fully connected layers, and softmax layers [162,163]…”
Section: Deep Learning Techniques (mentioning)
confidence: 99%
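The basic CNN structure named in this quotation (convolution, max pooling, fully connected, softmax) can be sketched in a few lines of PyTorch; the input shape, channel counts, and number of classes below are illustrative assumptions rather than architectures from the cited works.

```python
# Minimal sketch of the basic CNN structure: conv -> max pool -> fully connected -> softmax.
import torch
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolutional layer
    nn.ReLU(),
    nn.MaxPool2d(2),                             # max-pooling layer (32x32 -> 16x16)
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 4),                  # fully connected layer (4 classes assumed)
    nn.Softmax(dim=1),                           # softmax layer producing class probabilities
)

x = torch.randn(2, 1, 32, 32)  # batch of two single-channel 32x32 inputs (e.g., time-frequency maps)
probs = cnn(x)                 # shape: (2, 4)
```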