2006
DOI: 10.1007/978-3-540-36668-3_117

SV-kNNC: An Algorithm for Improving the Efficiency of k-Nearest Neighbor

Abstract: This paper proposes SV-kNNC, a new algorithm for k-Nearest Neighbor (kNN). The algorithm consists of three steps. First, Support Vector Machines (SVMs) are applied to select important training data. Then, k-means clustering is used to assign a weight to each training instance. Finally, unseen examples are classified by kNN. Fourteen datasets from the UCI repository were used to evaluate the performance of this algorithm. SV-kNNC is compared with conventional kNN and kNN with two instance reduc…
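The truncated abstract outlines three steps. A minimal sketch of that pipeline, assuming scikit-learn, a cluster-purity weighting scheme (the exact weighting used by the authors is not stated in the excerpt), and hypothetical helper names sv_knnc_fit and sv_knnc_predict, could look like this:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans

def sv_knnc_fit(X_train, y_train, n_clusters=10):
    # Step 1: instance selection -- keep only the SVM support vectors.
    svm = SVC(kernel="rbf").fit(X_train, y_train)
    X_sv, y_sv = X_train[svm.support_], y_train[svm.support_]

    # Step 2: k-means over the retained instances; as an assumption, each
    # instance is weighted by the class purity of the cluster it falls in
    # (the paper's exact weighting scheme is not given in the excerpt).
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(X_sv)
    weights = np.ones(len(X_sv))
    for c in np.unique(km.labels_):
        members = km.labels_ == c
        _, counts = np.unique(y_sv[members], return_counts=True)
        weights[members] = counts.max() / counts.sum()
    return X_sv, y_sv, weights

def sv_knnc_predict(x, X_sv, y_sv, weights, k=3):
    # Step 3: kNN vote over the reduced training set, scaled by instance weights.
    dist = np.linalg.norm(X_sv - x, axis=1)
    nearest = np.argsort(dist)[:k]
    votes = {}
    for i in nearest:
        votes[y_sv[i]] = votes.get(y_sv[i], 0.0) + weights[i]
    return max(votes, key=votes.get)
```

Because only the support vectors are retained, the kNN search at prediction time runs over a much smaller set than the original training data, which is the efficiency gain the title refers to.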

Cited by 14 publications (8 citation statements); references 5 publications.
“…A wrapper method based on SVM is proposed in Yuangui et al (2005), which performs a double selection: the first applies SVM to obtain V_s and the second applies DROP2 over V_s. Another instance selection method based on SVM is SV-kNNC (Support Vector k-Nearest Neighbor Clustering) (Srisawat et al 2006), which, after applying SVM over T, uses the k-means algorithm to cluster V_s and retains only the instances belonging to homogeneous clusters (clusters where all instances belong to the same class); when a cluster is non-homogeneous, only the instances of the majority class in that cluster are preserved.…”
Section: Wrapper Methods (mentioning)
confidence: 99%
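A minimal sketch of the retention rule this survey describes, assuming X_sv and y_sv are the instances already selected by the SVM step and using scikit-learn's k-means (the helper name retain_by_cluster is hypothetical):

```python
import numpy as np
from sklearn.cluster import KMeans

def retain_by_cluster(X_sv, y_sv, n_clusters=10):
    # Cluster the SVM-selected instances, then keep whole homogeneous clusters;
    # from a mixed cluster, keep only its majority-class instances.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X_sv)
    keep = np.zeros(len(X_sv), dtype=bool)
    for c in np.unique(labels):
        members = labels == c
        classes, counts = np.unique(y_sv[members], return_counts=True)
        majority = classes[np.argmax(counts)]
        # In a homogeneous cluster the majority class is the only class,
        # so this keeps every member; otherwise only the majority class stays.
        keep |= members & (y_sv == majority)
    return X_sv[keep], y_sv[keep]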
“…Thereby, the number and length of the intervals are produced based on the dataset. An efficient algorithm [15] has been applied to carry out this step. The SV-kNNC algorithm has a lower classification time than kNN.…”
Section: Phase II: Merging Intervals (mentioning)
confidence: 99%
“…To this end, the literature provides a variety of approaches. There are instance selection methods deliberately designed for specific classifiers, including k-nearest neighbors [e.g., [11][12][13][14][15]] and support vector machines [e.g., [16][17][18]]. There are also studies on more general methods, mostly built on a prescribed distance metric and application [e.g., [19][20][21][22][23]].…”
Section: Introduction (mentioning)
confidence: 99%