2021
DOI: 10.1016/j.jneumeth.2021.109274
Sparse representation-based classification with two-dimensional dictionary optimization for motor imagery EEG pattern recognition

Cited by 4 publications (2 citation statements); references 43 publications.
“…The unoccluded subset of 100 classes is selected as test samples. Some classical algorithms are selected for comparison, including sparse representation-based classification (SRC) [16], collaborative representation-based classification (CRC) [17], regularized robust coding for face recognition (RRC) [18], low-rank matrix recovery with structural incoherence (LR) [19], extended sparse representation-based classification (ESRC) [20], and the discriminative low-rank representation method (DLRR) [21]. All experiments are carried out on a computer with an Intel(R) Xeon(R) E5-2630 CPU, 64 GB of memory, and MATLAB R2014b [22].…”
Section: Methods (mentioning)
confidence: 99%
“…Classifications based on sparse representations of some EEG feature (SRC) [5, 6] are increasingly useful for obtaining good performance on untrained patterns [7], since they allow a smaller volume of data to suffice in the training process of a classification model. Some works [2, 4, 8] seek to improve this approach by focusing on optimizing and learning dictionaries for sparse representation; others [9–11], in turn, choose to focus on improving the features that will be represented.…”
Section: Introduction (mentioning)
confidence: 99%
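The citing passages above both rest on the same SRC principle: a test sample is coded as a sparse combination of training samples stacked as dictionary columns, and the predicted label is the class whose atoms reconstruct the sample with the smallest residual. A minimal sketch of that idea, using synthetic two-class data rather than EEG features and scikit-learn's orthogonal matching pursuit as the sparse solver (the data, `src_predict`, and all parameters here are illustrative assumptions, not the paper's pipeline):

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)

# Two synthetic classes: 20 training samples each, dimension 50,
# drawn as small perturbations of a per-class centre.
n_per_class, dim = 20, 50
centers = [rng.normal(size=dim), rng.normal(size=dim)]
train = [c + 0.1 * rng.normal(size=(n_per_class, dim)) for c in centers]

# Dictionary: columns are l2-normalised training samples, grouped by class.
D = np.hstack([t.T for t in train])
D = D / np.linalg.norm(D, axis=0)

def src_predict(y, D, n_per_class, n_classes=2, k=5):
    """SRC rule: sparse-code y over D, then pick the class whose
    atoms alone give the smallest reconstruction residual."""
    x = orthogonal_mp(D, y, n_nonzero_coefs=k)  # sparse coefficient vector
    residuals = []
    for c in range(n_classes):
        x_c = np.zeros_like(x)
        sl = slice(c * n_per_class, (c + 1) * n_per_class)
        x_c[sl] = x[sl]                          # keep only class-c coefficients
        residuals.append(np.linalg.norm(y - D @ x_c))
    return int(np.argmin(residuals))

# A test sample near the class-1 centre should be assigned to class 1.
y = centers[1] + 0.1 * rng.normal(size=dim)
pred = src_predict(y, D, n_per_class)
```

The per-class residual step is what distinguishes SRC from plain sparse coding: sparsity concentrates the coefficients on atoms of the correct class, so zeroing out the other classes barely changes the reconstruction for the true class but degrades it for the rest.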