2006
DOI: 10.1007/11840930_26
Nonnegative Matrix Factorization for Motor Imagery EEG Classification

Abstract: A brain-computer interface uses brain signals to communicate with external devices without actual movement. Many studies have applied machine learning to classify motor imagery. However, classifying imagery data with sparse spatial characteristics, such as single-arm motor imagery, remains a challenge. In this paper, we propose a method that factorizes EEG signals into two groups to classify motor imagery even when spatial features are sparse. Based on adversarial learning, we focus on extracting co…

Cited by 54 publications (81 citation statements)
References 9 publications (1 reference statement)
“…EEG classification plays a very important role in brain computer interface (BCI) where a subject's mental state is required to be estimated from EEG signals. Our previous work (Lee et al, 2006) demonstrated that NMF could extract spectral features that are useful in EEG classification. Two EEG data sets that were used in our empirical study are as follows:…”
Section: EEG Classification
confidence: 99%
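The snippet above credits NMF with extracting spectral features useful for EEG classification. A minimal sketch of that idea, using the classic Lee-Seung multiplicative updates on a stand-in nonnegative spectral matrix (the matrix sizes, rank, and random data here are assumptions for illustration, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a nonnegative EEG spectral matrix: rows = frequency bins,
# columns = time segments (power spectra are nonnegative by construction).
V = rng.random((64, 200))

def nmf(V, r, n_iter=200, eps=1e-9):
    """Factor V ~ W @ H (all entries nonnegative) via multiplicative updates."""
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(n_iter):
        # Lee-Seung updates preserve nonnegativity at every step.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(V, r=5)
# Each column of H is a compact spectral feature vector for one segment.
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Columns of H would then feed a downstream classifier, mirroring how the citing work treats the encoding matrix.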
“…Feature vectors correspond to the column vectors of the encoding variable matrix S. We use the same probabilistic model-based classifier as used in (Lemm et al, 2004; Lee et al, 2006). The best performance in this experiment was achieved when α = 0.5 or 1 and n = 5, as summarized in Table 1.…”
Section: Graz Dataset
confidence: 99%
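The snippet treats columns of the encoding matrix S as per-trial feature vectors for a probabilistic model-based classifier. As a stand-in for that classifier (whose exact form is in the cited papers), a simple Gaussian class-conditional model on synthetic two-class features — the feature dimension, trial counts, and class means below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical encoding features: 5 features per trial, 40 trials per class,
# the two imagery classes separated by a unit mean shift.
S_left = rng.normal(loc=0.0, size=(5, 40))
S_right = rng.normal(loc=1.0, size=(5, 40))
X = np.hstack([S_left, S_right]).T          # trials x features
y = np.array([0] * 40 + [1] * 40)

def fit_predict(X, y, X_new):
    """Gaussian class-conditional (naive Bayes style) classification."""
    stats = []
    for c in (0, 1):
        Xc = X[y == c]
        stats.append((Xc.mean(axis=0), Xc.var(axis=0) + 1e-6))
    preds = []
    for x in X_new:
        # Log-likelihood under each class's diagonal Gaussian model.
        ll = [-0.5 * np.sum((x - mu) ** 2 / var + np.log(var))
              for mu, var in stats]
        preds.append(int(np.argmax(ll)))
    return np.array(preds)

preds = fit_predict(X, y, X)
acc = (preds == y).mean()
```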
“…In this research, we adopt the CMORWT, because it can relate scale levels to actual frequencies [17], which enables us to relate the time-frequency components to common frequency features such as HF (0.20-0.35 Hz) or LF (0.05-0.2 Hz).…”
Section: CMORWT-based Feature Extraction of the Time-frequency Area
confidence: 99%
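The scale-to-frequency relation that motivates the CMORWT choice in the snippet can be sketched as f = fc · fs / scale, where fc is the wavelet's center frequency in cycles per sample. The sampling rate and center frequency below are illustrative assumptions, not values from the cited work:

```python
import numpy as np

def scale_to_frequency(scale, center_freq, fs):
    """Pseudo-frequency (Hz) of a complex Morlet wavelet at a given scale.

    center_freq is the wavelet center frequency in cycles/sample,
    fs the sampling rate in Hz.
    """
    return center_freq * fs / scale

# Assumed values: a 4 Hz resampled series with Morlet center frequency 1.0.
fs, fc = 4.0, 1.0
scales = np.arange(1, 129)
freqs = scale_to_frequency(scales, fc, fs)

# Select the scales whose pseudo-frequencies fall in the LF band (0.05-0.2 Hz).
lf_scales = scales[(freqs >= 0.05) & (freqs <= 0.2)]
```

This inversion between scale and frequency is what lets time-frequency components be mapped onto named bands such as HF and LF.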
“…The relationship matrices were built from the factor matrices with additional Gaussian noise with variance 0.01. We chose the relation (2,3) as the target matrix, where half of the columns have 50% of observed entries, and the remaining columns have varying ratios of observed entries from 0% to 90%. The other relation matrices had 50% of observed entries.…”
Section: BCRB Comparison on Synthetic Data
confidence: 99%
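The synthetic setup described in the snippet can be sketched as follows. The entity sizes and rank are assumptions chosen for illustration; the noise variance (0.01), the 50% observed half, and the 0%-90% sweep follow the quoted description:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed entity sizes and latent rank (not taken from the paper).
n2, n3, r = 50, 40, 5
U2 = rng.normal(size=(n2, r))
U3 = rng.normal(size=(n3, r))

# Relation (2,3) built from the factor matrices plus Gaussian noise
# with variance 0.01 (standard deviation 0.1).
X23 = U2 @ U3.T + rng.normal(scale=0.1, size=(n2, n3))

# Observation mask: the first half of the columns keep 50% observed entries;
# the remaining columns sweep the observation ratio from 0% to 90%.
mask = np.zeros((n2, n3), dtype=bool)
half = n3 // 2
mask[:, :half] = rng.random((n2, half)) < 0.5
ratios = np.linspace(0.0, 0.9, n3 - half)
for j, p in enumerate(ratios):
    mask[:, half + j] = rng.random(n2) < p

# Unobserved entries marked NaN, as a completion algorithm would see them.
X_observed = np.where(mask, X23, np.nan)
```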
“…Matrix co-factorization jointly decomposes multiple data matrices, where each decomposition is coupled by sharing some factor matrices. Matrix co-factorization has been used to improve the performance of matrix factorization by incorporating knowledge in the additional matrices, such as label information [16], link information [17], and inter-subject variations [3]. One of the advantages of the matrix co-factorization is that it can be applied for the general entity-relationship models of the target data and the additional data [9,14], where the factor matrices correspond to the entities and the input matrices correspond to the relationships of the model.…”
Section: Introduction
confidence: 99%
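The coupling described above — multiple decompositions sharing a factor matrix — can be sketched with a minimal alternating-least-squares co-factorization on two noiseless synthetic relations. The sizes, rank, and the ALS solver are illustrative assumptions; the cited works use their own models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two relations sharing the entity factor U:
#   X ~ U @ V.T  (e.g. the data matrix)
#   Y ~ U @ W.T  (e.g. label or link information)
n, m, k, r = 30, 20, 10, 4
U_true = rng.normal(size=(n, r))
X = U_true @ rng.normal(size=(m, r)).T
Y = U_true @ rng.normal(size=(k, r)).T

def cofactorize(X, Y, r, n_iter=50):
    """Alternating least squares with the factor U shared across relations."""
    U = rng.normal(size=(X.shape[0], r))
    for _ in range(n_iter):
        # Fit each relation's private factor given the shared U.
        V = np.linalg.lstsq(U, X, rcond=None)[0].T
        W = np.linalg.lstsq(U, Y, rcond=None)[0].T
        # Update U against the stacked relations so both couplings count.
        Z = np.hstack([X, Y])
        F = np.vstack([V, W])
        U = np.linalg.lstsq(F, Z.T, rcond=None)[0].T
    return U, V, W

U, V, W = cofactorize(X, Y, r)
err = np.linalg.norm(X - U @ V.T) + np.linalg.norm(Y - U @ W.T)
```

Because both relations were generated from the same U_true with exact rank r, the joint fit recovers them essentially exactly; with real data, the shared factor is what transfers knowledge between the relations.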