AIP Conference Proceedings 2009
DOI: 10.1063/1.3146187

Validation Based Modified K-Nearest Neighbor

Abstract: In this paper, a new classification method for enhancing the performance of K-Nearest Neighbor is proposed which uses robust neighbors in training data. The robust neighbors are detected us… Show more
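The abstract describes the core idea of the Modified K-Nearest Neighbor (MKNN) method: assign each training point a "validity" score based on how many of its own neighbors share its label, then let validated (robust) neighbors dominate the classification vote. A minimal sketch of that scheme follows; the parameter names (`H`, `alpha`) and the exact weighting formula `validity / (distance + alpha)` are assumptions for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def validity(X, y, H=5):
    """Validity of each training point: the fraction of its H nearest
    training neighbors that share its label (the 'robust neighbor' score)."""
    n = len(X)
    val = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf  # exclude the point itself from its own neighborhood
        nbrs = np.argsort(d)[:H]
        val[i] = np.mean(y[nbrs] == y[i])
    return val

def mknn_predict(X, y, val, q, K=3, alpha=0.5):
    """Classify query q by a validity-weighted vote over its K nearest
    training points; each neighbor contributes validity / (distance + alpha),
    so close AND robust neighbors carry the most weight."""
    d = np.linalg.norm(X - q, axis=1)
    nbrs = np.argsort(d)[:K]
    scores = {}
    for i in nbrs:
        w = val[i] / (d[i] + alpha)
        scores[y[i]] = scores.get(y[i], 0.0) + w
    return max(scores, key=scores.get)
```

Compared with plain kNN, a mislabeled or outlying training point receives low validity from its disagreeing neighborhood and therefore contributes little to any vote, which is the robustness the abstract refers to.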

Cited by 50 publications (47 citation statements)
References 20 publications
“…Parvin et al. [20] proposed the modified k-nearest neighbor (MKNN) method. This method is similar to our proposed MkNN classifier in that it utilizes information from the neighborhood of the POI.…”
Section: Step 1
confidence: 99%
“…Automatic classification of data points in a training set based on the k value was a research finding of G. Guo. In 2003, S. C. Bagui and K. Pal assigned a rank to the training data for each category using a Gaussian distribution, which dominated the other variations of kNN [9]. Modified kNN [10], suggested by Parvin et al. in 2008, used weights and the validity of data points to classify nearest neighbors and applied various outlier-elimination techniques. In 2009, Zeng et al. proposed a novel idea for kNN that uses n-1 classes over the entire training set to address the computational complexity [11].…”
Section: Literature Survey
confidence: 99%
“…Inspired by the neighbor concept (Parvin, Alizadeh, & Minaei-Bidoli, 2008), we suppose that the quality of a sample is determined by its neighbors. Hence, we provide the following formula to compute the quality.…”
Section: Quality Evaluation
confidence: 99%