2015
DOI: 10.18201/ijisae.75836
The Classification of Eye State by Using kNN and MLP Classification Models According to the EEG Signals

Abstract: Electroencephalography (EEG) is widely used to classify eye state as an indicator of a person's cognitive state. In this study, a method for online eye state detection from EEG signals was proposed, using the EEG eye state dataset obtained from the UCI Machine Learning Repository. The dataset is built from 14 continuous EEG measurements recorded over 117 seconds (14,980 samples per measurement). Weka (Waikato Environment for Knowledge Analysis…
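The study's Weka workflow is not reproduced here, but the kNN side of it can be illustrated with a minimal pure-Python sketch: classify a new sample by majority vote among its k nearest training samples. The feature values and `knn_predict` helper below are made up for illustration; the two-value tuples loosely stand in for EEG channel amplitudes, and the 0/1 labels mimic the dataset's eye-open/eye-closed target.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest
    training samples under Euclidean distance (basic kNN)."""
    neighbours = sorted(
        train,
        key=lambda sample: math.dist(sample[0], query),  # sample = (features, label)
    )[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical 2-feature samples; labels 0/1 mimic eye-open/eye-closed.
train = [
    ((4300.0, 4010.0), 0), ((4310.0, 4015.0), 0), ((4295.0, 4005.0), 0),
    ((4400.0, 4120.0), 1), ((4410.0, 4125.0), 1), ((4395.0, 4118.0), 1),
]
print(knn_predict(train, (4305.0, 4012.0)))  # query near the class-0 cluster -> 0
```

Because every query is compared against all stored samples, prediction cost grows with the training set, which is one reason the study's 14,980-sample recordings make runtime a relevant concern for online detection.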

Cited by 48 publications (29 citation statements) | References 11 publications
“…In [12], the k-NN classifier obtained the best accuracy in classifying EEG signals to identify engagement, enjoyment, frustration, and difficulty, compared with Bayes Network, Naïve Bayes, SVM, Multilayer Perceptron, Random Forest, and J48. Besides that, [13] shows that k-NN also provides better accuracy in eye state classification than Multilayer Perceptron neural networks.…”
Section: Introduction
confidence: 94%
“…In k-NN, a distance metric is used to calculate the distance between a new sample and the existing samples in the dataset. The literature overwhelmingly defaults to the Euclidean distance metric [10][11][12][13][15][16][17][18]. In fact, we are not aware of any studies focusing on a performance comparison among different distance metrics for k-NN.…”
Section: Introduction
confidence: 99%
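The statement above notes that k-NN results hinge on the chosen distance metric, with Euclidean dominating the literature. As a quick illustration of why the choice matters, here are three common metrics computed for the same pair of points (the points are arbitrary examples), showing that each ranks "closeness" differently:

```python
import math

def euclidean(a, b):
    """Straight-line (L2) distance."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    """City-block (L1) distance: sum of per-axis differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

def chebyshev(a, b):
    """L-infinity distance: largest single-axis difference."""
    return max(abs(x - y) for x, y in zip(a, b))

a, b = (1.0, 2.0, 3.0), (4.0, 0.0, 3.0)
print(euclidean(a, b))   # sqrt(13) ~ 3.6056
print(manhattan(a, b))   # 5.0
print(chebyshev(a, b))   # 3.0
```

Swapping the metric changes which training samples count as "nearest", and therefore the majority vote, which is why a systematic comparison of metrics for k-NN is a meaningful gap to point out.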
“…kNN is the simplest supervised learning algorithm among machine learning algorithms [14]. It does not use the training data points to do any generalization, so it is also called a lazy algorithm.…”
Section: K-Nearest Neighbor Algorithm
confidence: 99%
“…These techniques usually need a training and generalization procedure to make the best estimation. On the other hand, k-nearest neighbor (kNN) does not need to use the training data points for any generalization, so it can easily be applied to many categorization or classification tasks [14][15][16]. Although kNN is the simplest supervised learning algorithm, it has not been used to estimate the operating frequency of antennas so far.…”
Section: Introduction
confidence: 99%
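The two statements above both stress that kNN is "lazy": fitting builds no model, and all computation is deferred to prediction time. A minimal sketch of that split (the `LazyKNN` class and its toy data are hypothetical, not from the cited papers):

```python
import math
from collections import Counter

class LazyKNN:
    """A 'lazy' learner: fit() only stores the data;
    all work happens at predict time."""

    def __init__(self, k=3):
        self.k = k
        self.samples = []

    def fit(self, X, y):
        self.samples = list(zip(X, y))  # no model is built here
        return self

    def predict(self, x):
        nearest = sorted(self.samples,
                         key=lambda s: math.dist(s[0], x))[:self.k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]

clf = LazyKNN(k=1).fit([(0.0,), (10.0,)], ["low", "high"])
print(clf.predict((1.0,)))  # nearest stored sample is (0.0,) -> "low"
```

The trade-off the statements imply: training is free, but every prediction scans the stored samples, in contrast to eager learners (e.g. an MLP) that pay the cost up front.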
“…Furthermore, Rösler and Suendermann [26] build a corpus of roughly 15,000 samples and run the data through 42 classifiers with the help of the WEKA platform [29]. [27,28,30,31] use the same corpus ([26]) to apply classification approaches beyond those already reported.…”
Section: Electroencephalograms
unclassified