Proceedings of International Conference on Neural Networks (ICNN'96)
DOI: 10.1109/icnn.1996.549118
Classification with learning k-nearest neighbors

Cited by 121 publications (82 citation statements). References 3 publications.
“…Friedman [19] proposed an interesting way of adapting the metric based on tree-structured recursive partitioning of the data. Laaksonen and Oja [4] proposed to improve the k-NN reference vectors using LVQ techniques. Atkeson, Moore and Schaal [10] discuss locally weighted regression techniques and minimal distance methods with various metric and kernel functions applied to approximation problems.…”
Section: Summary and Discussion (mentioning)
confidence: 99%
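The approach this excerpt attributes to Laaksonen and Oja [4] — adapting the k-NN reference vectors with LVQ — can be illustrated with a short sketch. The Python code below is a minimal illustration under assumptions of our own, not the authors' exact procedure: it uses an LVQ1-style update (the names train_lvq1 and knn_predict, the learning rate, and the epoch count are hypothetical choices), then classifies with a k-nearest-neighbor vote over the adapted prototypes.

```python
import numpy as np

def train_lvq1(X, y, prototypes, proto_labels, lr=0.05, epochs=20):
    """LVQ1-style adaptation of k-NN reference vectors: the nearest
    prototype is pulled toward a sample of its own class and pushed
    away from a sample of a different class."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            j = np.argmin(np.linalg.norm(P - x, axis=1))  # nearest prototype
            step = lr * (x - P[j])
            P[j] += step if proto_labels[j] == label else -step
    return P

def knn_predict(x, P, proto_labels, k=3):
    """Majority vote among the k nearest adapted reference vectors."""
    idx = np.argsort(np.linalg.norm(P - x, axis=1))[:k]
    return np.bincount(proto_labels[idx]).argmax()
```

In practice the prototypes would be initialized from class-wise training samples; the number of prototypes per class and the learning-rate schedule are tuning choices the excerpt does not specify.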
“…4; therefore a smaller norm of the weights means that the network operates in an almost linear regime. Regularization methods add penalty terms to the error function, forcing the weights to become small and thus smoothing the network's approximation to the training data.…”
Section: A Framework for Minimal Distance Methods (mentioning)
confidence: 99%
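The excerpt describes penalty-term regularization in general; a common concrete instance is an L2 (weight-decay) penalty. The sketch below is illustrative only — the quadratic data error, the function name penalized_gradient_step, and the coefficient lam are assumptions, not anything specified by the cited text:

```python
import numpy as np

def penalized_gradient_step(w, X, y, lr=0.1, lam=0.01):
    """One gradient step on E(w) = 0.5*||X @ w - y||^2 + 0.5*lam*||w||^2.
    The lam * w term is the gradient of the penalty: it continually
    shrinks the weights, which keeps sigmoidal units near their linear
    range and smooths the fitted mapping."""
    data_grad = X.T @ (X @ w - y)          # gradient of the squared-error term
    return w - lr * (data_grad + lam * w)  # penalty adds lam * w to the gradient
```

The design point matches the excerpt: the penalty does not change the data term, it only biases the search toward small weights, trading some fit to the training data for a smoother approximation.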