2018 26th Signal Processing and Communications Applications Conference (SIU)
DOI: 10.1109/siu.2018.8404398
Choose of wart treatment method using Naive Bayes and k-nearest neighbors classifiers


Cited by 7 publications (4 citation statements); references 22 publications.
“…In the literature related to prediction of the success of wart treatment method, classifier algorithms of Fuzzy Rules (Khozeimeh et al 2017b), support vector machines (Uzun et al 2018a), Naive Bayes (Uzun et al 2018b) and k-Nearest Neighbors (Uzun et al 2018b) were used previously. We explored the use of multi-layer perceptron and extreme learning machine on solving this problem.…”
Section: Results (mentioning)
Confidence: 99%
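For context on the methods named here, below is a minimal sketch of Gaussian Naive Bayes and k-nearest neighbors evaluated with 10-fold cross-validation, as one might do for this prediction task. The feature matrix is a random placeholder standing in for the wart-treatment patient attributes (age, time elapsed, number of warts, etc. in the UCI Immunotherapy/Cryotherapy datasets); it does not reproduce the cited papers' pipelines.

# Sketch only: random placeholder data stands in for patient features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 7))      # placeholder features, 90 patients
y = rng.integers(0, 2, size=90)   # 1 = treatment succeeded, 0 = failed

models = [
    ("Naive Bayes", GaussianNB()),
    # kNN is distance-based, so features are standardized first.
    ("kNN (k=5)", make_pipeline(StandardScaler(),
                                KNeighborsClassifier(n_neighbors=5))),
]
for name, model in models:
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean 10-fold accuracy = {scores.mean():.3f}")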
“…Khozeimeh et al (2017b) prepared the database and obtained prediction accuracies of 83.3% for immunotherapy method and 80.7% for cryotherapy method. Uzun et al (2018a, 2018b, 2019) … Putra et al (2018) proposed the AdaBoost algorithm to determine the success of the selected wart treatment method and achieved the maximum classification performance of an accuracy of 93.89%, a sensitivity of 96.64%, and a specificity of 93.10%. Khatri et…”
Section: Validation and Performance Measures (mentioning)
Confidence: 99%
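The accuracy, sensitivity, and specificity quoted above are all derived from a 2x2 confusion matrix. A short sketch of the three measures (the label vectors below are invented, for illustration only):

# Invented labels for illustration; 1 = treatment success.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(f"accuracy={accuracy:.3f} "
      f"sensitivity={sensitivity:.3f} specificity={specificity:.3f}")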
“…Then, we compared the proposed model with the existing prediction models: k-nearest neighbor classifier (KNN) [27], support vector machine (SVM) [28], linear regression with L1 regularization (Lasso) [29], random forest (RF) [30], gradient tree boosting-based classifier implemented in the XGBoost package (XGBoost) [31], graph neural networks (GNN) [32] and graph convolution networks (GCN) [33]. As shown in Table 3, the above prediction models are trained directly with VAE.…”
Section: Combined Model Performance (mentioning)
Confidence: 99%
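The comparison described here pits one proposed model against several standard baselines on a shared feature representation. A hedged sketch of such a baseline comparison follows; random features stand in for the VAE representations, the "Lasso" baseline is rendered as L1-regularized logistic regression, and XGBoost/GNN/GCN are omitted to keep the example scikit-learn only.

# Sketch only: compare standard classifiers on one feature matrix.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))   # placeholder for learned (e.g. VAE) features
y = rng.integers(0, 2, size=200)

baselines = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    # L1-regularized logistic regression plays the role of "Lasso"
    # for a classification target.
    "Lasso": LogisticRegression(penalty="l1", solver="liblinear"),
    "RF": RandomForestClassifier(random_state=0),
}
for name, model in baselines.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean 5-fold accuracy = {scores.mean():.3f}")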