2013
DOI: 10.1016/j.patcog.2012.06.009
Nearest neighbor classifier generalization through spatially constrained filters

Abstract: It is widely understood that the performance of the nearest neighbor (NN) rule is dependent on: (i) the way distances are computed between different examples, and (ii) the type of feature representation used. Linear filters are often used in computer vision as a pre-processing step, to extract useful feature representations. In this paper we demonstrate an equivalence between (i) and (ii) for NN tasks involving weighted Euclidean distances. Specifically, we demonstrate how the application of a bank of linear fi…
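The equivalence the abstract refers to can be sketched numerically: applying a bank of linear filters and then comparing examples with the plain Euclidean distance is the same as comparing the raw examples under a weighted Euclidean distance whose weight matrix is built from the filters. The matrix names below (`F`, `M`) are illustrative choices, not the paper's notation, and this is a minimal reconstruction of the general identity rather than the paper's specific spatially constrained construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a bank of 3 linear filters over 5-dimensional inputs.
F = rng.standard_normal((3, 5))   # filter bank: each row is one linear filter
x = rng.standard_normal(5)        # query example
y = rng.standard_normal(5)        # stored example

# (ii) Feature-representation view: filter both examples, then use the
# ordinary Euclidean distance in the filtered feature space.
d_filtered = np.sum((F @ x - F @ y) ** 2)

# (i) Distance-computation view: compare the raw examples under a weighted
# Euclidean distance with weight matrix M = F^T F.
M = F.T @ F
diff = x - y
d_weighted = diff @ M @ diff

# The two views coincide, since ||F(x - y)||^2 = (x - y)^T F^T F (x - y).
assert np.isclose(d_filtered, d_weighted)
```

Because `M = F.T @ F` is positive semi-definite, any such filter bank induces a valid (pseudo-)metric for the NN rule, which is what lets filter design be read as metric design.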

Cited by 5 publications (1 citation statement)
References 16 publications
“…However, in the face recognition problem the number of registered persons is always increasing, which makes the traditional methods unsuitable. Up to now, on one hand, the most popular classification method used in this field is the nearest neighbors classifier [14]-[16]. Moreover, it can also be transferred to a binary classification problem, and traditional binary classification methods, such as SVM [17]-[19] and Adaboost [20]-[22], can also be used for this problem.…”
Section: Introduction (mentioning)
confidence: 99%