2020
DOI: 10.1007/978-981-15-5616-6_24
An Analysis of Computational Complexity and Accuracy of Two Supervised Machine Learning Algorithms—K-Nearest Neighbor and Support Vector Machine

Cited by 15 publications (8 citation statements)
References 7 publications
“…The complexity of SVM has been well explored and confirmed. Referring to several pieces of research [44,45], we concluded that the complexity of linear SVM would be O(d) and that of RBF-SVM would be O(d²), where d denotes the dimension of input.…”
Section: Discussion
confidence: 98%
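As an illustration of the O(d) claim in the statement above (a sketch, not code from the cited paper): once a linear SVM is trained, prediction reduces to a single dot product over the d input dimensions plus a bias, so classifying one sample costs O(d). The weights below are chosen purely for illustration.

```python
# Minimal sketch: per-sample prediction cost of a trained linear SVM.
# After training, the model is just a weight vector w and a bias b,
# so classifying one sample x is one dot product over d dimensions: O(d).
def linear_svm_predict(w, b, x):
    """Sign of the decision function f(x) = w . x + b (O(d) work)."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

# hypothetical trained model in d = 3 dimensions
w, b = [0.5, -1.0, 0.25], 0.1
print(linear_svm_predict(w, b, [1.0, 0.2, 0.4]))   # -> 1
print(linear_svm_predict(w, b, [0.0, 1.0, 0.0]))   # -> -1
```

A kernel SVM, by contrast, must evaluate the kernel against its support vectors at prediction time, which is why its cost grows faster with the input dimension.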
“…1TBA functionalized nanopapers failed to distinguish lactose but showed an acceptable accuracy for classifying the other glycans. When the number of classification groups increases, the computational complexity of the machine learning classifier increases dramatically [42]. To evaluate the limitation of the machine learning classifier, we analyzed all of the spectra previously collected from 11 sample groups, including 7 monosaccharides, maltose, isomaltose, lactose, and the mixture of galactose and glucose.…”
Section: Enhancing Classification Accuracy Using the Collective SERS ...
confidence: 99%
“…These findings show that the circuit can solve linear systems quickly in a wide range of applications, indicating that IMC is a promising choice for future big data and machine learning accelerators. Ray [11] analyzes the complexity of KNN and SVM and then examines other complexity-reducing methodologies. A hybrid method that combines KNN and SVM is also suggested, to exploit the strengths of the two techniques.…”
Section: Related Work
confidence: 99%
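The hybrid KNN+SVM method itself is not detailed in this excerpt. One common pattern in KNN/SVM hybrids is to let KNN answer cheaply when its nearest neighbours agree, and to defer only the ambiguous points to the SVM. A pure-Python sketch under that assumption, with a hypothetical fixed linear decision function standing in for a trained SVM:

```python
import math

def hybrid_predict(train, k, svm_decision, x):
    """KNN first: if the k nearest neighbours agree, use their label;
    otherwise defer to the SVM-style decision function (a stand-in here)."""
    # distance to every training point: O(n * d)
    dists = sorted((math.dist(xi, x), yi) for xi, yi in train)
    labels = [yi for _, yi in dists[:k]]
    if len(set(labels)) == 1:      # unanimous neighbourhood: KNN decides
        return labels[0]
    return svm_decision(x)         # ambiguous region: SVM decides

# toy 2-D data: class 1 on the right, class -1 on the left
train = [((1.0, 0.0), 1), ((2.0, 0.0), 1), ((-1.0, 0.0), -1), ((-2.0, 0.0), -1)]

def linear_svm(x):                 # hypothetical trained boundary, not a real fit
    return 1 if x[0] + x[1] >= 0 else -1

print(hybrid_predict(train, 2, linear_svm, (1.5, 0.0)))   # -> 1 (neighbours agree)
print(hybrid_predict(train, 2, linear_svm, (0.1, 0.0)))   # -> 1 (neighbours split; SVM decides)
```

The appeal of such a split is that the expensive classifier only runs on the fraction of samples where the cheap one is uncertain.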
“…To test the model's performance, a new equivalent dataset containing only the input characteristics is supplied, and the goal of the model is to predict the target variable using the knowledge gathered during training. The predicted values are compared to the actual target values using an appropriate performance metric [11,13].…”
Section: Machine Learning
confidence: 99%
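The held-out evaluation described in this passage can be sketched as follows. The 1-nearest-neighbour model and the data here are illustrative only, not drawn from [11,13]: fit on a training split, predict on unseen test samples, and score with an accuracy metric.

```python
import math

def predict_1nn(train, x):
    """Label of the single closest training sample (1-nearest neighbour)."""
    return min(train, key=lambda p: math.dist(p[0], x))[1]

def accuracy(train, test):
    """Fraction of test samples whose prediction matches the true label."""
    hits = sum(predict_1nn(train, x) == y for x, y in test)
    return hits / len(test)

# illustrative splits: the model never sees the test labels during "training"
train = [((0.0, 0.0), "a"), ((0.2, 0.1), "a"), ((1.0, 1.0), "b"), ((0.9, 1.2), "b")]
test  = [((0.1, 0.0), "a"), ((1.1, 1.0), "b"), ((0.8, 0.9), "b"), ((0.0, 0.2), "b")]
print(accuracy(train, test))   # -> 0.75 (last test label deliberately mismatched)
```

Accuracy is the simplest such metric; the same comparison of predicted against actual targets underlies precision, recall, and the other metrics commonly reported.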