2004
DOI: 10.1016/j.patrec.2003.09.002
A comparative analysis of structural risk minimization by support vector machines and nearest neighbor rule

Cited by 30 publications (10 citation statements)
References 16 publications
“…This test has been done on real data, and the results show that the ANN performs better than the other two methods. B. Karacali et al. [17] compared the SVM and k-NN methods by error rate and, by combining the two to exploit the power of SVM and the simplicity of k-NN, developed a hybrid classifier that retains the advantages of both. M. O'Farrell et al. [18] compared k-NN and ANN for the classification of spectral data.…”
Section: Related Work; Background and Motivation (mentioning)
confidence: 99%
“…This test has been done on real data, and the results show that the ANN can perform better than the other two methods. Karacali et al. [17] compared the SVM and k-NN methods by error rate and, by combining the two to exploit the power of SVM and the simplicity of k-NN, obtained a hybrid classifier that retains the advantages of both. O'Farrell et al. [18] compared k-NN and ANN for the classification of spectral data.…”
Section: Related Work (mentioning)
confidence: 99%
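The comparison described in the statements above (SVM versus k-NN judged by error rate) can be sketched as follows. This is a minimal illustration, not the cited authors' implementation: the dataset, hyperparameters, and scikit-learn API are assumptions introduced here for illustration only.

```python
# Sketch: compare SVM and k-NN by test error rate on a synthetic dataset.
# Not the method of Karacali et al. [17]; purely illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Toy two-class problem standing in for the "real data" mentioned above.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)          # structural-risk-minimizing classifier
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)  # simple local rule

svm_err = 1 - svm.score(X_te, y_te)
knn_err = 1 - knn.score(X_te, y_te)
print("SVM error rate: %.3f" % svm_err)
print("k-NN error rate: %.3f" % knn_err)
```

A hybrid in the spirit of [17] would then route each test point to whichever of the two rules is more reliable for it, but that selection logic is beyond this sketch.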
“…By mapping the signal data into a higher-dimensional space, the data points tend to become more dispersed, which makes it easier for the model to separate the samples into different clusters. Moreover, SVM is based on the principle of structural risk minimization, which significantly enhances the robustness of the model [5]. However, the complexity of SVM brings not only stability and robustness but also the difficulty of parameter selection.…”
Section: Introduction (mentioning)
confidence: 99%
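The parameter-selection difficulty noted in the statement above is commonly handled with a grid search over the SVM hyperparameters. A minimal sketch, assuming scikit-learn and an illustrative toy dataset (the grid values `C` and `gamma` below are arbitrary choices, not from the cited paper):

```python
# Sketch: cross-validated search over SVM hyperparameters C and gamma,
# illustrating why SVM parameter selection is costlier than tuning k-NN's
# single neighbor count. Illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=8, random_state=1)

grid = GridSearchCV(
    SVC(kernel="rbf"),
    {"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},  # 9 candidate settings
    cv=3,                                          # 3-fold cross-validation
)
grid.fit(X, y)
print("best params:", grid.best_params_)
print("best CV accuracy: %.3f" % grid.best_score_)
```

Each of the 9 settings is refit 3 times, so even this small grid costs 27 SVM trainings, whereas k-NN has no training phase at all.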