2016 2nd International Conference on Advanced Technologies for Signal and Image Processing (ATSIP)
DOI: 10.1109/atsip.2016.7523093
A DWT-entropy-ANN based architecture for epilepsy diagnosis using EEG signals

Cited by 22 publications (18 citation statements)
References 15 publications
“…Detection of Alzheimer's disease also achieved a classification rate of 100% using a feed-forward neural network and the DCT [13]. The use of DWT, Shannon entropy, and an FFNN [15] provided an accuracy of 100% for the diagnosis of epilepsy. An accuracy of 94% was obtained for the detection of Parkinson's disease using SPECT images and a sequential grass-fire algorithm [16].…”
Section: Discussion
confidence: 99%
“…However, the wavelet transform requires a lot of storage, and its computational cost is high. Khalil et al. [15] proposed a technique that uses the discrete wavelet transform (DWT) to split the EEG signal. The signal comprises several frequency sub-bands: alpha, beta, gamma, delta, and theta.…”
Section: Diagnosis Of Epilepsy
confidence: 99%
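The DWT-plus-entropy feature extraction described in this excerpt can be sketched as follows. The excerpt does not state the wavelet, decomposition level, or sampling rate, so the Haar wavelet, five levels, and the ~173.6 Hz rate below are illustrative assumptions, not the authors' settings:

```python
import numpy as np

def haar_step(x):
    """One level of the Haar DWT: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def wavedec(x, level):
    """Multi-level DWT; returns [cA_L, cD_L, ..., cD_1] sub-band coefficients."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(level):
        a, d = haar_step(a)
        details.append(d)
    return [a] + details[::-1]

def shannon_entropy(c):
    """Shannon entropy of the normalized coefficient energies."""
    p = c ** 2 / np.sum(c ** 2)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Toy EEG-like signal: 256 samples of a 10 Hz tone plus noise
# (assumed ~173.6 Hz sampling rate, for illustration only).
rng = np.random.default_rng(0)
t = np.arange(256) / 173.6
signal = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(256)

coeffs = wavedec(signal, level=5)            # 6 sub-band coefficient arrays
features = [shannon_entropy(c) for c in coeffs]  # one entropy value per sub-band
```

A 5-level decomposition of 256 samples yields six coefficient arrays, one per sub-band; the per-band entropies would then serve as the feature vector fed to a classifier such as the FFNN.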
“…In detail, the algorithm initializes K (typically starting from 1). After loading the data, it computes the Euclidean distance between the test point and each row of the training data, sorts those distances in ascending order, takes the topmost k rows from the sorted array, and returns the most frequent class among them as the predicted class [22]. In this research the value of K was tuned, and the k giving the best efficiency was chosen for the classifier model to reduce overfitting.…”
Section: K-Nearest Neighbours (KNN)
confidence: 99%
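The steps quoted above (distance, sort, top-k, majority vote) can be sketched in a few lines of numpy; the data and variable names here are illustrative:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_test, k):
    """Predict the class of x_test by majority vote among its k nearest
    training points under Euclidean distance."""
    # Euclidean distance from the test point to each row of training data
    dists = np.linalg.norm(X_train - x_test, axis=1)
    # sort ascending and keep the topmost k rows
    nearest = np.argsort(dists)[:k]
    # return the most frequent class among those k neighbours
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]

# Toy example: two classes in 2-D.
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = np.array([0, 0, 1, 1])
pred = knn_predict(X_train, y_train, np.array([0.2, 0.2]), k=3)
```

Tuning K, as the excerpt describes, amounts to repeating this prediction over a held-out set for each candidate k and keeping the value with the best accuracy.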
“…Ensemble learning models such as random forests are made of individual decision trees: a group of weak learners is combined into a strong learner, with each decision tree operating in a divide-and-conquer fashion. Every decision tree predicts a class, and the model's final class is decided by their vote [22]. Two parameters were tuned in the RFC models in this study: 'n_estimators', the number of trees in the forest, and 'max_depth', the depth of each tree.…”
Section: Random Forest Classifier (RFC)
confidence: 99%
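The two-parameter tuning described in this excerpt can be sketched with scikit-learn's `RandomForestClassifier` and a grid search; the synthetic dataset and the candidate grids for `n_estimators` / `max_depth` below are illustrative assumptions, not the study's actual settings:

```python
# Sketch of tuning n_estimators and max_depth on a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Illustrative candidate values for the two tuned parameters.
param_grid = {"n_estimators": [25, 50], "max_depth": [3, 5, None]}
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3)
search.fit(X, y)
best = search.best_params_  # parameter combination with best CV accuracy
```

Each fitted forest predicts by majority vote across its trees, matching the voting logic the excerpt attributes to [22].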
“…There are several studies that use different entropy measures for EEG signals such as Rényi entropy, 25 diffusion entropy, 26 and Kraskov entropy. 27 EEG complexity measures as feature vectors are also used for discrimination of different kinds of disorders [28][29][30][31] via multiple classification methods such as SVM 32 and ANN. 33 The literature includes several studies on SUD that are classified via different ML approaches.…”
Section: Introduction
confidence: 99%