DOI: 10.1007/978-3-540-74976-9_25

IKNN: Informative K-Nearest Neighbor Pattern Classification

Abstract: The K-nearest neighbor (KNN) decision rule has been a ubiquitous classification tool with good scalability. Past experience has shown that the optimal choice of K depends upon the data, making it laborious to tune the parameter for different applications. We introduce a new metric that measures the informativeness of objects to be classified. When applied as a query-based distance metric to measure the closeness between objects, two novel KNN procedures, Locally Informative-KNN (LI-KNN) and Globally …
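The abstract notes that the optimal K depends on the data and is laborious to tune. A common baseline (not the paper's informativeness metric) is to pick K by leave-one-out accuracy on the training set; the sketch below is illustrative, and the names `knn_label` and `best_k` are hypothetical helpers, not from the paper.

```python
from collections import Counter
import math

def knn_label(train, query, k):
    """Majority label among the k nearest neighbors of `query`
    under Euclidean distance. `train` is a list of (point, label)."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(lbl for _, lbl in nearest).most_common(1)[0][0]

def best_k(train, candidates=(1, 3, 5)):
    """Choose K by leave-one-out accuracy; ties favor the earlier candidate."""
    def loo_acc(k):
        hits = sum(knn_label(train[:i] + train[i + 1:], x, k) == y
                   for i, (x, y) in enumerate(train))
        return hits / len(train)
    return max(candidates, key=loo_acc)

# Toy 2-D data: class 'a' clustered near the origin, class 'b' near (5, 5).
train = [((0, 0), 'a'), ((1, 0), 'a'), ((0, 1), 'a'),
         ((5, 5), 'b'), ((6, 5), 'b'), ((5, 6), 'b')]
print(best_k(train))  # K = 5 fails here (5 neighbors span both 3-point classes)
```

On this toy set K = 1 and K = 3 are both perfect while K = 5 always outvotes the held-out point's own class, which illustrates why the right K is data-dependent.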

Cited by 149 publications (101 citation statements)
References 11 publications (22 reference statements)
“…Despite the simplicity of the model, it employs the strategy of using a distance measure designed by the researcher, thus closely mimicking human decision making [26]. Moreover, unlike other verifiers, the K-NN does not require a large number of parameters.…”
Section: Verification (mentioning)
confidence: 99%
“…Almost all verifiers depend, in one way or another, on a given distance measure. K-NN is advantageous in that it reflects human decision making, because it is based only on the distance measure designed by the researcher [26]. K-NN also does not involve as many parameters as other verifiers.…”
Section: Introduction (mentioning)
confidence: 99%
“…A sample is assigned to the class of its k nearest patterns, where k is a positive integer that is usually odd. Work on this method ranges from the pioneering articles to modifications of the original, such as those presented in [24], [25], [26], [27], [28], which introduce variations of the original method or combinations of classifiers. Across the applications in which it has been used, this method has proven to be one of the most efficient for pattern recognition and classification.…”
Section: Clasificador K-NN (unclassified)
“…The KNN classifier is implemented with varying numbers of nearest neighbors (K), different distance measures, and different decision rules. Finally, the new sample is classified by majority vote (Kotsiantis 2007; Song et al. 2007; Starzacher and Rinner 2008).…”
Section: Classification (mentioning)
confidence: 99%
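The statement above describes the standard KNN recipe: pick K, pick a distance measure, classify by majority vote. A minimal sketch of that rule (not the paper's LI-KNN/GI-KNN procedures; `knn_predict` and `manhattan` are illustrative names):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3, dist=math.dist):
    """Classify `query` by majority vote among its k nearest
    training points under the given distance function.
    `train` is a list of (point, label) pairs."""
    nearest = sorted(train, key=lambda p: dist(p[0], query))[:k]
    return Counter(lbl for _, lbl in nearest).most_common(1)[0][0]

def manhattan(u, v):
    """An alternative distance measure, pluggable via `dist`."""
    return sum(abs(a - b) for a, b in zip(u, v))

# Toy 2-D data: class 'a' near the origin, class 'b' near (5, 5).
train = [((0, 0), 'a'), ((1, 0), 'a'), ((0, 1), 'a'),
         ((5, 5), 'b'), ((6, 5), 'b'), ((5, 6), 'b')]
print(knn_predict(train, (0.5, 0.5), k=3))                  # -> a
print(knn_predict(train, (5.5, 5.5), k=3, dist=manhattan))  # -> b
```

Swapping the `dist` argument is the hook for the "different measurement distances" the quote mentions; the vote over the K nearest labels is the decision rule.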