2007
DOI: 10.1016/j.patrec.2006.07.002
Improving nearest neighbor rule with a simple adaptive distance measure

Cited by 217 publications (78 citation statements). References 6 publications.
“…This closeness is defined by means of the distance d, which makes the choice of metric essential, since different metrics will most likely produce different classifications. Consequently, the choice of metric is widely discussed in the literature, as shown in [83]. Note that the other main drawback of this technique is the selection of the number of neighbors to consider [84].…”
Section: Related Work
confidence: 99%
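The metric-sensitivity point above can be made concrete. Below is a minimal sketch (illustrative data and function names, not taken from the cited paper) of a query whose 1NN label flips between Euclidean (L2) and Manhattan (L1) distance:

```python
import numpy as np

def nearest_neighbor(query, X, metric):
    """Return the index of the training point closest to `query` under `metric`."""
    return int(np.argmin([metric(query, x) for x in X]))

euclidean = lambda a, b: np.linalg.norm(a - b)  # L2 distance
manhattan = lambda a, b: np.abs(a - b).sum()    # L1 distance

# Two training points and one query; which point is "nearest"
# depends on the metric, so the 1NN prediction flips.
X = np.array([[0.0, 3.0],   # class 0
              [2.0, 2.0]])  # class 1
y = np.array([0, 1])
query = np.array([0.0, 0.0])

print(y[nearest_neighbor(query, X, euclidean)])  # L2: 3.0 vs ~2.83 -> class 1
print(y[nearest_neighbor(query, X, manhattan)])  # L1: 3.0 vs 4.0   -> class 0
```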
“…• The Non-IS family includes three classifiers that retain all the training instances as their knowledge base: KNN, CenterNN [28] and KNNAdaptive [29].…”
Section: Methods
confidence: 99%
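KNNAdaptive [29] refers to the adaptive distance measure of the paper indexed here. A minimal sketch of that idea, assuming a Euclidean base distance and illustrative data (variable names are ours, not the paper's): the distance from the query to each training point x_i is divided by r_i, the distance from x_i to its nearest training point of a different class, so points surrounded by their own class attract queries more strongly.

```python
import numpy as np

def adaptive_distances(query, X, y):
    """Scale the Euclidean distance from the query to each training point
    x_i by r_i, the distance from x_i to its nearest point of another class."""
    d = np.linalg.norm(X - query, axis=1)                             # raw distances
    pairwise = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # all-pairs distances
    r = np.array([pairwise[i, y != y[i]].min() for i in range(len(X))])
    return d / r

# A class-1 point squeezed against the class-0 cluster gets a tiny r,
# so its influence shrinks and the adaptive 1NN decision flips.
X = np.array([[0.0, 0.0], [1.5, 0.0], [1.2, 0.0]])
y = np.array([0, 0, 1])
query = np.array([0.9, 0.0])

print(y[np.argmin(np.linalg.norm(X - query, axis=1))])  # plain 1NN    -> 1
print(y[np.argmin(adaptive_distances(query, X, y))])    # adaptive 1NN -> 0
```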
“…The class label of the unknown sample is then predicted to be the most frequent one occurring among the k nearest neighbours. The 1NN classifier is well explored in the literature and has been shown to achieve good classification performance on a wide range of real-world data sets [1][2][3][4].…”
Section: Introduction
confidence: 99%
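For completeness, a minimal sketch of the majority-vote rule described above (illustrative data; Euclidean distance assumed; k=1 recovers the 1NN classifier):

```python
import numpy as np
from collections import Counter

def knn_predict(query, X, y, k=3):
    """Predict the most frequent class among the k nearest training points."""
    order = np.argsort(np.linalg.norm(X - query, axis=1))
    return Counter(y[order[:k]]).most_common(1)[0][0]

X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
y = np.array([0, 0, 1, 1])
print(knn_predict(np.array([0.1, 0.0]), X, y, k=1))  # -> 0 (nearest point wins)
print(knn_predict(np.array([0.6, 0.5]), X, y, k=3))  # -> 1 (two of three nearest)
```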