2021
DOI: 10.1088/1741-2552/abe20e
Decoding and interpreting cortical signals with a compact convolutional neural network

Abstract: Objective. Brain–computer interfaces (BCIs) decode information from neural activity and send it to external devices. The use of Deep Learning approaches for decoding allows for automatic feature engineering within the specific decoding task. Physiologically plausible interpretation of the network parameters ensures the robustness of the learned decision rules and opens the exciting opportunity for automatic knowledge discovery. Approach. We describe a compact convolutional network-based architecture for adapti…

Cited by 17 publications (23 citation statements)
References 38 publications (73 reference statements)
“…As shown in [40], if we relax the assumption that the length of the data chunk equals the length of the temporal convolution filter, we arrive at a Fourier-domain representation of the dynamics of a neuronal population as a pattern Q_m(f), derived from the power spectral density (PSD) of the spatially filtered data v_m[n] and the Fourier transform H_m(f) of the temporal weights vector h_m, as in (5). The important distinction that sets our weights-interpretation approach apart from the methodology used in most reports employing neural networks with separable spatial and temporal filtering operations is that our procedure accounts for the fact that the spatial filter is formed within the context set by the corresponding temporal filter, and vice versa. Also, in [40] the authors first introduced the notion of the frequency-domain pattern Q_m(f) of a neuronal population’s activity. Note that Q_m(f) differs from H_m(f) in the same way that a spatial pattern differs from spatial filter weights, as brilliantly illustrated earlier in [17].…”
Section: Methods (mentioning; confidence: 96%)
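The relationship quoted above can be sketched in code. This is a minimal illustration with hypothetical names, assuming the pointwise-product form Q_m(f) = PSD_{v_m}(f) · H_m(f): in the Fourier domain the temporal covariance is approximately diagonalised, so the "pattern = covariance × filter" rule becomes a per-frequency product of the spatially filtered signal's PSD and the temporal filter's frequency response.

```python
import numpy as np
from scipy.signal import welch

def frequency_domain_pattern(X, w, h, fs):
    """Frequency-domain pattern Q_m(f) of one network branch (hypothetical
    helper, not the authors' code).

    Assumes Q_m(f) = PSD_{v_m}(f) * H_m(f): the Fourier transform
    approximately diagonalises the temporal covariance, so the
    'pattern = covariance x filter' rule becomes a pointwise product.
    """
    v = w @ X                          # spatially filtered time course v_m[n]
    nfft = len(h)
    # PSD of v_m evaluated on the same frequency grid as the filter's DFT
    f, psd = welch(v, fs=fs, nperseg=nfft, return_onesided=False)
    H = np.fft.fft(h)                  # frequency response of temporal weights
    return f, psd * H                  # frequency-domain pattern Q_m(f)

# stand-in data and weights, purely illustrative
rng = np.random.default_rng(0)
fs, n_ch, n_samp, n_tap = 250, 8, 5000, 64
X = rng.standard_normal((n_ch, n_samp))   # multichannel recording
w = rng.standard_normal(n_ch)             # learned spatial weights (stand-in)
h = rng.standard_normal(n_tap)            # learned temporal weights (stand-in)
f, Q = frequency_domain_pattern(X, w, h, fs)
```

Unlike inspecting H_m(f) alone, Q_m(f) folds in the spectral content of the spatially filtered data, which is what makes it interpretable as the spectral profile of the underlying neuronal population rather than of the filter.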
“…During training, the ED can potentially adapt to extracting the instantaneous power of the activity of specific neuronal populations pivotal for the downstream task of predicting the LMSCs. In the search for the optimum, the ED weights are not only tuned toward such a target source but also tuned away from the interfering sources [17, 40]. Proper interpretation of the learnt ED weights allows for subsequent discovery of the target source’s geometric and dynamical properties.…”
Section: Methods (mentioning; confidence: 99%)
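The filter-versus-pattern distinction invoked in both excerpts (the point of [17]) can be made concrete with a short sketch. The transformation a = Cov(x) w converts extraction weights, which may carry noise-cancelling components, into an interpretable pattern showing how the source projects onto the sensors. Names and the toy topography below are illustrative, not taken from the paper.

```python
import numpy as np

def spatial_pattern(X, w):
    """Turn spatial filter weights w into an interpretable spatial
    pattern via a = Cov(x) w, normalised to unit length."""
    C = np.cov(X)                     # channel-space covariance
    a = C @ w
    return a / np.linalg.norm(a)

# toy check: one source mixed into 4 channels with a known topography
rng = np.random.default_rng(1)
topo = np.array([1.0, 0.5, -0.3, 0.0])       # assumed ground-truth mixing
s = rng.standard_normal(10_000)              # source time course
noise = 0.05 * rng.standard_normal((4, 10_000))
X = np.outer(topo, s) + noise                # observed channels

w = np.linalg.pinv(np.cov(X)) @ topo         # a crude unmixing filter
a = spatial_pattern(X, w)                    # recovers topo (up to scale)
```

The filter w itself need not resemble the topography at all, while the recovered pattern a does; this is exactly the Q_m(f) vs. H_m(f) relationship carried over to the spatial domain.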