Investigating naturalistic hand movements by behavior mining in long-term video and neural recordings
2020 · Preprint
DOI: 10.48550/arxiv.2001.08349

Cited by 1 publication (2 citation statements) · References 0 publications
"Through a series of systematic experiments, we use decoders tailored to each participant (by far the most common approach to training decoders) as a standard against which the performance of our generalized decoders is compared (figure 1(B)). In particular, our training data are from 12 ECoG participants during uninstructed, naturalistic arm movements (figure 1(C), [57, 38]); our test data are then either one ECoG participant withheld from the training set, or participants from an entirely independent EEG dataset. HTNet consistently outperformed other decoders, and fine-tuning pre-trained HTNet decoders with a small number of the unseen participant's events yielded decoders that approached the performance of tailored decoders trained on many more events."
Section: Results
Confidence: 99%
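The held-out evaluation described above is a leave-one-participant-out scheme, optionally augmented by moving a few of the unseen participant's events into the training set for fine-tuning. A minimal sketch of that split logic (function names and the `n_finetune` parameter are illustrative, not from the cited work):

```python
import numpy as np

def lopo_splits(participant_ids, n_finetune=0):
    """Yield (train_idx, test_idx) index pairs, holding out one
    participant at a time (leave-one-participant-out).

    participant_ids: 1-D array-like giving each event's participant ID.
    n_finetune: if > 0, the first n_finetune held-out events are moved
        into the training set, mimicking fine-tuning a pre-trained
        decoder on a small number of the unseen participant's events.
    """
    ids = np.asarray(participant_ids)
    for p in np.unique(ids):
        train = np.flatnonzero(ids != p)
        test = np.flatnonzero(ids == p)
        if n_finetune > 0:
            # Move a few held-out events into the training pool.
            train = np.concatenate([train, test[:n_finetune]])
            test = test[n_finetune:]
        yield train, test
```

With `n_finetune=0` this reproduces the plain generalized-decoder evaluation; increasing it trades a few of the held-out participant's events for participant-specific adaptation.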
"Our decoding task was to classify upper-limb 'move' and 'rest' events of the arm contralateral to the implanted electrode hemisphere. We obtained non-concurrent move and rest events from video recordings via markerless pose tracking and automated state segmentation (see Singh et al [57] for further details). Move events correspond to wrist movement that occurred after at least 0.5 s of no movement, while rest events indicate no movement in either wrist for at least 3 s."
Section: Intracranial Electrocorticography (ECoG) Dataset
Confidence: 99%
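The event definitions in this excerpt (movement onset after at least 0.5 s of quiescence; rest windows of at least 3 s with no movement) can be sketched as a segmentation over a boolean per-frame movement trace. This is a simplified illustration, not the authors' pipeline: the published approach derives movement from markerless pose tracking and checks both wrists for rest, whereas this sketch takes a single pre-computed boolean trace, and all names and the sampling rate are assumptions:

```python
import numpy as np

def extract_events(moving, fs, quiet_before_move_s=0.5, rest_min_s=3.0):
    """Segment 'move' onsets and 'rest' windows from a movement trace.

    moving: 1-D boolean array, True on frames where the wrist is moving
        (e.g. pose-tracked wrist speed above some threshold).
    fs: video sampling rate in frames per second.
    Returns (move_onsets, rest_starts) as lists of frame indices.
    """
    moving = np.asarray(moving, dtype=bool)
    quiet = int(round(quiet_before_move_s * fs))
    rest_len = int(round(rest_min_s * fs))

    # Move events: movement onset preceded by >= 0.5 s of no movement.
    move_onsets = [
        t for t in range(quiet, len(moving))
        if moving[t] and not moving[t - quiet:t].any()
    ]

    # Rest events: non-overlapping windows of >= 3 s with no movement.
    rest_starts = []
    t = 0
    while t + rest_len <= len(moving):
        if not moving[t:t + rest_len].any():
            rest_starts.append(t)
            t += rest_len  # advance past this window so rests don't overlap
        else:
            t += 1
    return move_onsets, rest_starts
```

Because move and rest events are drawn from disjoint stretches of the trace, the resulting classes are non-concurrent by construction, matching the excerpt's description.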