Adaptive boost LS-SVM classification approach for time-series signal classification in epileptic seizure diagnosis applications
2020
DOI: 10.1016/j.eswa.2020.113676

Cited by 67 publications (37 citation statements)
References 49 publications
“…AdaBoost was combined with SVM for time-series signal classification in epileptic seizure diagnosis by Hadeethi et al. [35].…”
Section: Related Work (mentioning)
confidence: 99%
“…However, for the CD-E cases, the proposed model has the best performance compared with other methods, where the state-of-the-art methods yield accuracies ranging between 98 and 99.33%. For the AB-CD-E case, Al-Hadeethi et al. [7] and Wu et al. [10], with 99% accuracy, stay behind Jiang et al. [4]. The proposed model yields the best performance in the AB-CD-E case.…”
Section: Results (mentioning)
confidence: 99%
“…Evaluated performance was 99.17% with the support vector machine (SVM) classifier. Al-Hadeethi et al. [7] presented the adaptive boost least squares SVM (AdaBoost LS-SVM) classifier to detect seizures. Also, the covariance matrix is used for feature extraction.…”
Section: Introduction (mentioning)
confidence: 99%
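As an illustration of the approach this statement describes (AdaBoost combined with an SVM-type classifier over covariance-matrix features), the following is a minimal sketch in scikit-learn. It is not the authors' implementation: a standard SVC stands in for the paper's least-squares SVM, and the covariance_features helper, the EEG segment layout, and all hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch: AdaBoost over an SVM-style base learner on covariance features.
# The cited paper uses an LS-SVM; scikit-learn's SVC is substituted here purely for illustration.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

def covariance_features(segments):
    """Map each multi-channel segment to the upper triangle of its channel covariance matrix."""
    feats = []
    for seg in segments:                       # assumed shape: (n_channels, n_samples)
        cov = np.cov(seg)                      # (n_channels, n_channels) covariance matrix
        iu = np.triu_indices(cov.shape[0])
        feats.append(cov[iu])                  # flatten the upper triangle into a feature vector
    return np.asarray(feats)

# X_segments: EEG segments, y: seizure / non-seizure labels (assumed to be available)
# X = covariance_features(X_segments)
clf = AdaBoostClassifier(
    estimator=SVC(kernel="rbf", probability=True),  # SVM base learner (use base_estimator= on older scikit-learn)
    n_estimators=50,
    learning_rate=1.0,
)
# clf.fit(X, y)
# y_pred = clf.predict(X_test_features)
```

The boosting loop reweights the training segments after each round, so later SVM rounds concentrate on the segments the earlier ones misclassified, which is the general idea behind combining AdaBoost with an SVM base classifier.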
“…We also used machine learning models to perform a comparison between the results of machine learning models and the proposed DS-MLP. Six machine learning models, namely logistic regression (LR) [36], [37], random forest (RF) [38], [39], decision tree (DT) [40], support vector machine (SVM) [41], K nearest neighbour (KNN) [42] and Gaussian Naive Bayes (GNB) [43], are used for this purpose. We use RF with 300 n_estimators, which means that 300 decision trees are constructed by RF on each example to give predictions, and then majority voting is performed between the 300 predictions to make the final prediction.…”
Section: B. Comparison With Other Machine Learning Models (mentioning)
confidence: 99%
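To make the comparison setup in this statement concrete, here is a minimal sketch assuming scikit-learn and an already-prepared train/test split. The model names and the 300-tree random forest follow the quoted text; every other hyperparameter, and the data variables, are assumptions.

```python
# Hypothetical sketch of the baseline classifiers described in the quoted statement (scikit-learn assumed).
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

models = {
    "LR": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(n_estimators=300),  # 300 trees; per-tree predictions are aggregated per sample
    "DT": DecisionTreeClassifier(),
    "SVM": SVC(),
    "KNN": KNeighborsClassifier(),
    "GNB": GaussianNB(),
}

# X_train, X_test, y_train, y_test: assumed feature matrices and labels from the study's data split
# for name, model in models.items():
#     model.fit(X_train, y_train)
#     print(name, accuracy_score(y_test, model.predict(X_test)))
```

Note that scikit-learn's RandomForestClassifier aggregates the trees by averaging their predicted class probabilities rather than by a strict hard-majority vote, so this sketch only approximates the voting scheme the quoted statement describes.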