2010
DOI: 10.1109/tit.2010.2040857

Rates of Convergence of the Functional $k$-Nearest Neighbor Estimate

Abstract: Let $F$ be a separable Banach space, and let $(X, Y)$ be a random pair taking values in $F \times \mathbb{R}$. Motivated by a broad range of potential applications, we investigate rates of convergence of the $k$-nearest neighbor estimate $r_n(x)$ of the regression function $r(x) = \mathbb{E}[Y \mid X = x]$, based on $n$ independent copies of the pair $(X, Y)$. Using compact embedding theory, we present explicit and general finite sample bounds on the expected squared difference $\mathbb{E}[r_n(X) - r(X)]^2$, and particularize our results to classical functio…
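To make the object of study concrete, the sketch below implements the standard $k$-nearest neighbor regression estimate $r_n(x) = \frac{1}{k}\sum_{i=1}^{k} Y_{(i)}(x)$, where $Y_{(1)}(x), \dots, Y_{(k)}(x)$ are the responses attached to the $k$ training predictors closest to $x$. This is a minimal illustrative sketch only, not the paper's construction: it uses finite-dimensional (or grid-discretized functional) inputs with a Euclidean norm standing in for the norm of the Banach space $F$, and the function name `knn_regress` and the toy data are assumptions made here for illustration.

```python
import numpy as np

def knn_regress(x_query, X_train, Y_train, k=5):
    """Minimal k-NN regression sketch (not the paper's abstract functional setting).

    x_query : (d,) array, the point at which to estimate r(x) = E[Y | X = x].
    X_train : (n, d) array of predictors (e.g. curves discretized on a grid).
    Y_train : (n,) array of real-valued responses.
    k       : number of neighbors; the paper studies how the choice k = k(n)
              drives the rate of convergence.
    """
    # Distances from the query to every training predictor; the Euclidean
    # norm here stands in for the norm of the space F.
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k nearest neighbors (unordered among themselves).
    nn_idx = np.argpartition(dists, k - 1)[:k]
    # The k-NN estimate r_n(x): average of the neighbors' responses.
    return Y_train[nn_idx].mean()

# Usage sketch: n noisy observations of a simple functional of x.
rng = np.random.default_rng(0)
n, d = 500, 100                       # d grid points per "curve"
X = rng.normal(size=(n, d))
Y = X.mean(axis=1) + 0.1 * rng.normal(size=n)
print(knn_regress(X[0], X[1:], Y[1:], k=10))
```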

Cited by 77 publications (72 citation statements)
References 17 publications
“…The convergence and consistency properties of trees and random forests have been studied by, among others, Biau [2012], Biau et al. [2008], Breiman [2004], Breiman et al. [1984], Meinshausen [2006], Scornet et al. [2015], Wager and Walther [2015], and Zhu et al. [2015]. Meanwhile, their sampling variability has been analyzed by Duan [2011], Lin and Jeon [2006], Mentch and Hooker [2016], Sexton and Laake [2009], and ….”
Section: Theoretical Background (mentioning)
confidence: 99%
“…For b-NN, the recent papers by Biau and colleagues (15, 17, 19, 20) showed consistency under general conditions when operating as regression estimators. Therefore, these machines are also consistent when acting as probability machines.…”
Section: Methods (mentioning)
confidence: 96%
“…While being intuitive and simple to implement, k-nearest neighbor regression is also well understood from the point of view of theory; see, e.g., [2], [3], [13], and the references therein for an overview of the most important theoretical results. These theoretical results are also supported by empirical studies: for example, in their recent paper, Stensbo-Smidt et al. found that nearest neighbor regression outperforms model-based prediction of star formation rates [30], while Hu et al. showed that a model based on k-nearest neighbor regression is able to estimate the capacity of lithium-ion batteries [19].…”
Section: Ecknn: K-nearest Neighbor Regression With Error Correction (mentioning)
confidence: 99%