2022
DOI: 10.1101/2022.11.02.22281775

Online internal speech decoding from single neurons in a human participant

Abstract: Speech brain-machine interfaces (BMIs) translate brain signals into words or audio outputs, enabling communication for people who have lost their speech abilities due to disease or injury. While important advances in vocalized, attempted, and mimed speech decoding have been achieved, results for internal speech decoding are sparse and have yet to achieve high functionality. Notably, it is still unclear from which brain areas internal speech can be decoded. In this work, a tetraplegic participant with implante…

Cited by 11 publications (12 citation statements)
References 60 publications
“…Fifteen healthy participants (5 females, mean age 23.9 years, SD ±2.3, range 19-29) took part in this study which was approved by the local Ethics Committee (Commission cantonale d'éthique de la recherche, project 2022-00451) and was performed in accordance with the Declaration of Helsinki. All participants provided written informed consent and received financial compensation for their participation.…”
Section: Participants
confidence: 99%
“…Although previous studies have characterized the neural correlates of imagined speech (Ikeda et al., 2014), most often in comparison with overt speech (Brumberg et al., 2016; Leuthardt et al., 2012; Martin et al., 2016; Pei et al., 2011; Proix et al., 2022; Soroush et al., 2022), only a handful of BCI studies have attempted to decode imagined speech in real time, with promising but often limited effectiveness (Angrick et al., 2021; Leuthardt et al., 2011; Sereshkeh et al., 2017; Wandelt et al., 2022). This is due to different challenges and limitations primarily pertaining to the weakness of imagined speech signals as compared to overt speech (Brumberg et al., 2016; Leuthardt et al., 2012; Martin et al., 2014; Pei et al., 2011; Proix et al., 2022), the difficulty to precisely identify the onset of speech imagery (Martin et al., 2018b), inter-individual differences in the ability to control the BCI (Marchesotti et al., 2016), and to the technique employed to record brain activity.…”
Section: Introduction
confidence: 99%
“…This study would suggest that articulation imagery led to stronger activations in the PostCG and the MFG, whereas hearing imagery did so in the STG. Also, pointing toward the idea that imagined speech might be better decoded from sensory areas, a recent study [60] found that a small set of words could be consistently classified from human intracortical recordings in the supramarginal gyrus, posterior to the PostCG and mainly thought to be involved in language processing.…”
Section: Mimed and Imagined Utterances Might Be Optimally Synthesized...
confidence: 99%
“…[5][6][7][27] extended continuous decoding to arm control, with [27] controlling the user's own muscles. Recent work has also decoded speech from sensorimotor cortex [28][29][30][31]. However, relatively few BMI studies have focused on hand control [32][33][34][35][36][37], and previous studies frequently combine the ring and little fingers or leave them out altogether.…”
Section: Introduction
confidence: 99%