2023 11th International IEEE/EMBS Conference on Neural Engineering (NER)
DOI: 10.1109/ner52421.2023.10123866
On Transfer Learning for Naive Brain Computer Interface Users

Cited by 1 publication (3 citation statements)
References 18 publications
“…One recording session consisted of four runs of MI task performance, with one run comprising 20 trials (10 each for the left and right directions) presented in a randomized order. This dataset was also analyzed and verified in previous studies (Liu et al., 2023; Kumar et al., 2024). In the online demonstration, subjects performed four offline runs to collect calibration data for training the decoder, then performed five runs online on the same day (Figure 1, right), so as to reduce the effects of between-session non-stationarities.…”
Section: Materials and Methods, Participants and Experimental Procedures (supporting)
Confidence: 72%
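
The calibration-then-online structure described in this citation statement can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example (not the cited authors' code) of training a Riemannian minimum-distance-to-mean (MDM) decoder on offline calibration runs and applying it to same-day online runs; the synthetic data, array shapes, and the use of the pyriemann library are assumptions made purely for illustration.

# Minimal sketch, assuming epoched EEG arrays of shape (n_trials, n_channels, n_samples)
# and a simple covariance + MDM pipeline; not the decoder used in the paper.
import numpy as np
from pyriemann.estimation import Covariances
from pyriemann.classification import MDM

rng = np.random.default_rng(0)
# Four offline calibration runs x 20 trials; labels 0 = left, 1 = right (synthetic stand-in).
calib_X = rng.standard_normal((80, 8, 256))
calib_y = np.tile([0, 1], 40)

# Train the decoder on the offline calibration runs only.
cov = Covariances(estimator="lwf")   # shrinkage covariance estimation per trial
mdm = MDM(metric="riemann")          # minimum distance to Riemannian class means
mdm.fit(cov.fit_transform(calib_X), calib_y)

# Five online runs recorded on the same day are then decoded trial by trial.
online_X = rng.standard_normal((100, 8, 256))
predictions = mdm.predict(cov.transform(online_X))
print(predictions[:10])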
“…Furthermore, this adaptation can be carried out online in a completely unsupervised manner, i.e., without knowing the ground-truth labels of the incoming test examples (Kumar et al., 2024). Combining rebiasing with DCCA, our proposed approach achieved a between-session performance comparable to that obtained with sophisticated deep learning models evaluated on the exact same dataset (Liu et al., 2023).…”
Section: The DCCA-Riemannian-MDM Approach (mentioning)
Confidence: 77%
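
The unsupervised "rebiasing" mentioned in this statement is commonly implemented as recentering each session's covariance matrices at their Riemannian mean, which needs no class labels and can therefore run online on unlabeled test data. The sketch below shows that generic recentering step under stated assumptions (the pyriemann helper functions and the recenter name are mine); it is not the specific rebiasing or DCCA pipeline of the cited work.

# Minimal sketch of Riemannian recentering ("rebiasing") for cross-session transfer.
import numpy as np
from pyriemann.utils.mean import mean_riemann
from pyriemann.utils.base import invsqrtm

def recenter(covs):
    # Whiten a stack of SPD covariance matrices (n_trials, n_ch, n_ch) by their
    # Riemannian mean. No labels are needed, so this can run on test data online.
    R = mean_riemann(covs)            # session-specific reference matrix
    R_isqrt = invsqrtm(R)             # R^(-1/2)
    return R_isqrt @ covs @ R_isqrt   # congruence transform R^(-1/2) C R^(-1/2)

# Tiny synthetic demo: random SPD "session" covariances, recentered at the identity.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8, 8))
covs_test = A @ A.transpose(0, 2, 1) + 8 * np.eye(8)
covs_test_rc = recenter(covs_test)
# After recentering, the Riemannian mean of the session is (numerically) the identity.
print(np.round(mean_riemann(covs_test_rc), 2))

In practice one recenters the calibration-session and test-session covariances separately and then trains/applies the decoder (e.g., the MDM classifier sketched earlier) in the shared, recentered space.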