2020 4th International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT)
DOI: 10.1109/ismsit50672.2020.9254720

Classification of Diabetes Dataset with Data Mining Techniques by Using WEKA Approach

Cited by 29 publications (34 citation statements)
References 21 publications
“…Their developed model achieved precision values of 94.49%, 93.57%, 75.6%, and 100%, recall values of 98.62%, 84.89%, 81.78%, and 100%, F-measure values of 96.32%, 88.8%, 77.12%, and 100%, accuracy values of 96.74%, 85.73%, 75.78%, and 100%, and AUC values of 99%, 87%, 76%, and 100% for datasets 1, 2, 3, and 4, respectively. Alpan and Ilgi [13], in turn, applied various ML models to an early-stage diabetes risk prediction dataset to predict diabetes. Their KNN model reached an accuracy of 98.07% in predicting diabetes disease.…”
Section: A. Traditional Machine Learning Techniques (mentioning, confidence: 99%)
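The KNN result quoted above was obtained with WEKA, the toolkit used in the cited paper. Below is a minimal sketch of how such a 10-fold cross-validated KNN evaluation could be run with the WEKA Java API; the file name diabetes_risk.arff and the choice of k = 3 are assumptions for illustration, not the cited authors' exact configuration.

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.lazy.IBk;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class KnnDiabetesEvaluation {
    public static void main(String[] args) throws Exception {
        // Load the early-stage diabetes risk dataset (file name assumed for illustration).
        Instances data = new DataSource("diabetes_risk.arff").getDataSet();
        // The class label (Positive/Negative) is taken to be the last attribute.
        data.setClassIndex(data.numAttributes() - 1);

        // k-nearest-neighbours classifier; k = 3 is an assumed value, not the cited setting.
        IBk knn = new IBk();
        knn.setKNN(3);

        // 10-fold cross-validation, the usual WEKA evaluation protocol.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(knn, data, 10, new Random(1));

        // The same metrics the citing papers report: accuracy, precision, recall, F-measure, AUC.
        System.out.printf("Accuracy : %.2f%%%n", eval.pctCorrect());
        System.out.printf("Precision: %.4f%n", eval.weightedPrecision());
        System.out.printf("Recall   : %.4f%n", eval.weightedRecall());
        System.out.printf("F-measure: %.4f%n", eval.weightedFMeasure());
        System.out.printf("AUC      : %.4f%n", eval.weightedAreaUnderROC());
    }
}
```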
See 4 more Smart Citations
“…Their developed model achieved the performance regarding precision values: 94.49%, 93.57%, 75.6%, and 100%, recall values: 98.62%, 84.89%, 81.78%, and 100%, F-measure values: 96.32%, 88.8%, 77.12%, and 100%, accuracy values: 96.74%, 85.73%, 75.78%, and 100%, and AUC values: 99%, 87%, 76%, and 100%, for datasets 1, 2, 3, 4, respectively. However, Alpan and Ilgi [13] employed various ML models to perform an early-stage diabetes risk prediction dataset to predict diabetes. Their employed model: KNN, received an accuracy value of 98.07% while predicting diabetes disease.…”
Section: A Traditional Machine Learning Techniquesmentioning
confidence: 99%
“…Their developed model generated an accuracy of 98% while predicting chronic liver disease. Examination of these prior investigations reveals that past studies either generated various dimension reduction approaches (attribute permutation and hierarchical clustering, binary firefly algorithm, cooperative coevolution, LDA, NCA, ReliefF, Chaotic Darcy optimization, CSO, KH, BFO) in [22], [28], [30], [32], [33], [34], [36]; or employed single ML classifiers (DT, SVM, RF, MLP, NB, LR, KNN, XGB, LGBM, SVM-linear) in [5], [9], [13], [14], [16], [17], [18], [21], [23]; or developed combined approaches consisting of outlier detection and removal together with imbalance learning algorithms (cluster-based oversampling, DBSCAN with SMOTE, Isolation Forest with SMOTETomek, the IQR algorithm with SMOTE, instance selection with SMOTE) in [10], [11], [12], [19], [26]; or utilized only single imbalance learning algorithms (SMOTE, SVM-SMOTE) in [20], [24], [25], [27]; or implemented hyperparameter optimization strategies (2-level genetic optimizer with c-type SVM, LR with a GA optimization strategy) in [29], [35]; or implemented several DL-enabled techniques (Conv-LSTM, deep extreme learning model, LSTM, ANN, deep neural network, MLP, convergent artificial intelligence model, deep convolutional neural network, successive encoder-decoder approach, VAE, CLUSTIMP) in [2], [3], [37], [38], [39], [40], [41], [42], [44], [45], [46], [47], [48],…”
Section: Ensemble Learning Techniques (mentioning, confidence: 99%)
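Several of the combined approaches listed in this statement pair an imbalance learning step such as SMOTE with a downstream classifier. The sketch below illustrates that general pattern in WEKA, assuming the optional "SMOTE" package is installed via the WEKA package manager, a dataset file named diabetes.arff, and RandomForest as an arbitrary classifier choice; the outlier detection and removal step (e.g. IQR-based) used by some of the cited studies is omitted for brevity.

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.RandomForest;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.supervised.instance.SMOTE;   // from WEKA's optional "SMOTE" package

public class SmoteRandomForestSketch {
    public static void main(String[] args) throws Exception {
        // Dataset file name is an assumption for illustration.
        Instances data = new DataSource("diabetes.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1);

        // Oversample the minority class with SMOTE (default: +100% synthetic minority instances).
        SMOTE smote = new SMOTE();
        smote.setInputFormat(data);
        Instances balanced = Filter.useFilter(data, smote);

        // Any classifier named in the surveyed studies could be used; RandomForest here.
        RandomForest rf = new RandomForest();

        // Evaluate with 10-fold cross-validation on the balanced data.
        Evaluation eval = new Evaluation(balanced);
        eval.crossValidateModel(rf, balanced, 10, new Random(1));
        System.out.printf("Accuracy: %.2f%%, AUC: %.4f%n",
                eval.pctCorrect(), eval.weightedAreaUnderROC());
    }
}
```

Note that oversampling the whole dataset before cross-validation, as sketched here and as done in some of the cited pipelines, lets synthetic minority samples leak across folds; applying SMOTE only to each training fold gives a more conservative estimate.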