2017
DOI: 10.1007/s00362-017-0946-0

kNN estimation in functional partial linear modeling

Cited by 26 publications (14 citation statements) · References 35 publications
“…KNN is a nonparametric prediction algorithm. It searches for the k most similar feature vectors in the historical database to predict the future value [20]. The model has a simple structure and high computational efficiency.…”
Section: VMD
confidence: 99%
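The statement above describes the core kNN prediction step: find the k most similar historical feature vectors and average their targets. The following Python snippet is a minimal illustrative sketch of that idea, not code from the cited paper; the simulated data and the choice k=3 are placeholder assumptions.

```python
import numpy as np

def knn_predict(history_X, history_y, query, k=3):
    """Predict a value as the mean target of the k nearest
    historical feature vectors (Euclidean distance)."""
    # Distances from the query to every historical feature vector
    dists = np.linalg.norm(history_X - query, axis=1)
    # Indices of the k most similar (smallest-distance) vectors
    nearest = np.argsort(dists)[:k]
    # Average their observed targets to predict the future value
    return history_y[nearest].mean()

# Toy historical database (assumed data, for illustration only)
rng = np.random.default_rng(0)
X_hist = rng.normal(size=(100, 5))     # 100 past feature vectors
y_hist = X_hist @ np.array([1.0, 0.0, 2.0, 0.0, -1.0]) \
         + rng.normal(scale=0.1, size=100)
x_new = rng.normal(size=5)             # new observation to predict for
print(knn_predict(X_hist, y_hist, x_new, k=3))
```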
“…Estimation procedures: several methods have been studied for estimating the SFPLR model (a kNN sketch follows this excerpt), including:
- a fully automatic estimation procedure with a data-driven method and cross-validation for selecting the bandwidth (smoothing parameter) of the nonparametric component [7];
- the asymptotic normality of the linear part [8];
- settings in which the number of observations per subject is completely flexible, with the convergence rate of the nonparametric part [9];
- a spline estimator of the nonparametric part, with its convergence rate [10];
- two estimation procedures, functional principal components regression (FPCR) and functional ridge regression (FRR) based on Tikhonov regularization [11][12][13];
- new estimators for the parametric component, called semiparametric least squares estimators (SLSE) [14];
- the nonparametric component approximated by a B-spline function [15] or a polynomial spline [16], with the slope function estimated on the functional principal component basis [15,16];
- k-nearest-neighbours (kNN) estimates, whose local adaptivity makes them perform better in practice than kernel methods, together with some computational properties of this estimator [17][18][19];
- the Functional Semiparametric Additive Model via COmponent Selection and Smoothing Operator (FSAM-COSSO) in a sparse setting [20];
- sufficient dimension reduction methods such as sliced inverse regression (SIR) and sliced average variance estimation (SAVE) [21];
- estimators from reproducing kernel Hilbert spaces (RKHS) [22];
- frequentist and optimal model averaging [23];
- a latent group structure with K-means clustering [24];
- a joint asymptotic framework called the joint Bahadur representation [25];
- empirical likelihood estimation for non-functional high-dimensional covariates [26];
- sparse and penalized least-squares estimators [27];
- and software for carrying out these analyses [28].
Confidence regions: some papers include sections on computing confidence regions, so we do not repeat them here.…”
Section: Other Extensions
confidence: 99%
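To make the kNN route concrete, here is a minimal sketch of a Robinson/Speckman-type profiling estimator for a semi-functional partial linear model, y = Xβ + m(T) + ε, using a kNN smoother over the functional covariate. The L2 distance between discretized curves, the simulated data, and k = 10 are illustrative assumptions, not the cited authors' implementation.

```python
import numpy as np

def knn_weights(curves, k=10):
    """Row-stochastic kNN weight matrix: W[i, j] = 1/k if curve j is
    among the k nearest neighbours of curve i (L2 distance), else 0."""
    n = curves.shape[0]
    # Pairwise L2 distances between discretized curves
    d = np.linalg.norm(curves[:, None, :] - curves[None, :, :], axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        # Exclude the curve itself (leave-one-out neighbours)
        W[i, np.argsort(d[i])[1:k + 1]] = 1.0 / k
    return W

# --- simulated data (assumptions, for illustration only) ---
rng = np.random.default_rng(1)
n, p, grid = 200, 2, 50
T = rng.normal(size=(n, grid))     # functional covariates (discretized curves)
X = rng.normal(size=(n, p))        # scalar covariates of the linear part
beta = np.array([1.5, -2.0])
m = np.sin(T.mean(axis=1))         # nonparametric effect of the curve
y = X @ beta + m + rng.normal(scale=0.1, size=n)

# --- profiling estimator with a kNN smoother ---
W = knn_weights(T, k=10)
X_t, y_t = X - W @ X, y - W @ y    # subtract the kNN-smoothed parts
beta_hat, *_ = np.linalg.lstsq(X_t, y_t, rcond=None)  # linear part
m_hat = W @ (y - X @ beta_hat)     # kNN estimate of the nonparametric part
print(beta_hat)
```

The local adaptivity mentioned in the excerpt comes from the neighbourhood size adjusting to the local density of curves, whereas a kernel smoother uses one global bandwidth.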
“…Remark 1 Conditions C1–C9 are not the weakest possible conditions, but they are imposed to facilitate the proof of the theorem. Conditions C1–C6 are required in the context of the semi-functional partial linear model (see Ling et al., 2017). They are also a direct extension of Aneiros-Pérez and Vieu (2006).…”
Section: Asymptotic Properties
confidence: 99%
“…Aneiros-Pérez and Vieu (2008) extended the model to the time series setting. Ling et al. (2017) proposed a k-nearest-neighbours (kNN) procedure and derived the asymptotic performance of the kNN estimators. Aneiros et al. (2015) extended the model to the high-dimensional framework.…”
Section: Introduction
confidence: 99%