2016
DOI: 10.1214/15-aos1395
Classification in general finite dimensional spaces with the k-nearest neighbor rule

Cited by 48 publications (80 citation statements). References 27 publications.
“…We derive an excess risk bound that generalizes the worst-case results of [7] but also implies faster rates when the distance between f and g is large. This acceleration is a consequence of the nice properties of the margin (see also, e.g., [2] and [14]).…”
Section: Our Functional Model
confidence: 96%
“…This setting has been extensively studied. Popular classification procedures that are now theoretically well understood include the ERM method [27,2], the k-nearest neighbors algorithm [16,9,3,14], support vector machines [32], and random forests [4], to name a few.…”
Section: Introduction
confidence: 99%
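The excerpt above refers to the k-nearest neighbors algorithm. A minimal sketch of that rule, assuming Euclidean distance and majority voting (the implementation details here are illustrative, not taken from the cited works):

```python
from collections import Counter
import math

def knn_classify(train, x, k=3):
    """Classify point x by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; distance is Euclidean.
    """
    by_dist = sorted(train, key=lambda pl: math.dist(pl[0], x))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters labeled 0 and 1
train = [((0.0, 0.0), 0), ((0.1, 0.1), 0), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
print(knn_classify(train, (0.05, 0.05), k=3))  # → 0
```

The theoretical results cited (e.g., [16,9,3,14]) concern how the excess risk of exactly this kind of rule behaves as k and the sample size grow.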
“…We present theoretical guarantees for k-nearest neighbor, fixed-radius near neighbor, and kernel regression where the data reside in a metric space. The proofs borrow heavily from the work by Chaudhuri and Dasgupta (2014) with some influence from the work by Gadat et al (2016). These authors actually focus on classification, but proof ideas translate over to the regression setting.…”
Section: Introduction
confidence: 99%