2007
DOI: 10.1002/cjs.5550350410

Nonlinear functional models for functional responses in reproducing kernel Hilbert spaces

Abstract: An extension of reproducing kernel Hilbert space (RKHS) theory provides a new framework for modeling functional regression models with functional responses. The approach presumes only a general nonlinear regression structure, as opposed to the previously studied linear regression models. Generalized cross-validation (GCV) is proposed for automatic smoothing parameter estimation. The new RKHS estimate is illustrated on both simulated and real data.

Cited by 56 publications (43 citation statements)
References 10 publications
“…There are relatively few published recommendations in the statistical literature on how to construct k_1(·,·). For example, Lian [] writes “[...] the construction of k_1 in general is difficult and a search of the literature does not seem to provide us with any clues about how to construct a positive definite kernel in general.” Nonetheless, if we shift our attention to the machine learning literature, we see that k_1(t, t_i) = G(t, t_i), where G(t, t_i) is a Green's function of the linear differential operator L in Ly(t) [Fasshauer; Fasshauer and Ye; Poggio and Girosi; Rasmussen and Williams]. Note that the Green's function also depends on the boundary conditions.…”
Section: Methods
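The Green's-function construction described in this statement can be checked numerically. A minimal sketch, assuming the simple operator Ly = y′ with boundary condition y(0) = 0 (this choice of operator is illustrative, not from the paper): its Green's function is G(t, u) = 1{u ≤ t}, which induces the Brownian-motion kernel min(s, t), and the resulting Gram matrix should be positive semi-definite.

```python
import numpy as np

# Illustrative example: for Ly = y' with y(0) = 0, the Green's function is
# G(t, u) = 1{u <= t}, so the induced kernel is
# k1(s, t) = ∫ G(s, u) G(t, u) du = min(s, t)  (the Brownian-motion kernel).
# We verify positive semi-definiteness of the Gram matrix on a grid.
t = np.linspace(0.01, 1.0, 20)   # avoid t = 0, where the kernel degenerates
K = np.minimum.outer(t, t)       # Gram matrix: K[i, j] = min(t_i, t_j)
eigvals = np.linalg.eigvalsh(K)  # eigenvalues of the symmetric Gram matrix
print(eigvals.min() >= -1e-12)   # positive semi-definite -> True
```

Any valid Green's-function kernel must pass this check; a negative eigenvalue would signal that the candidate k_1 is not a reproducing kernel.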
“…Other choices of basis functions can also be used, with corresponding changes to the penalized terms. Possible choices include, but are not limited to, (a) the truncated power basis (t − k_i)_+^p, (b) O'Sullivan splines [Wand and Ormerod], (c) thin plate splines [Ivanescu et al.], or (d) the Gaussian kernel [Lian].…”
Section: Methods
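As a concrete instance of choice (a), here is a minimal sketch of a truncated power basis evaluator. The function name, knot locations, and grid are illustrative assumptions, not taken from any of the cited papers:

```python
import numpy as np

def truncated_power_basis(t, knots, p=3):
    """Truncated power basis of degree p (illustrative helper).

    Columns: 1, t, ..., t^p, then (t - k_i)_+^p for each knot k_i;
    each knot column is identically zero to the left of its knot."""
    t = np.asarray(t, dtype=float)
    poly = np.vander(t, p + 1, increasing=True)  # 1, t, ..., t^p
    trunc = np.clip(t[:, None] - np.asarray(knots)[None, :], 0, None) ** p
    return np.hstack([poly, trunc])

B = truncated_power_basis(np.linspace(0, 1, 50), knots=[0.25, 0.5, 0.75])
print(B.shape)  # (50, 7): 4 polynomial columns + 3 truncated-power columns
```

Swapping in O'Sullivan or thin-plate splines would change only how the design matrix B and the associated penalty matrix are built; the downstream penalized fit is unchanged.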
“…Let F_K be the reproducing kernel Hilbert space generated by the kernel K(·,·). See Lian () for more about the RKHS F_K. It can be shown that the solution with F = F_K takes the form f(X) = c_0 + Σ_{j=1}^n c_j K(X(·), X̂_j(·)).…”
Section: Functional Nonparametric Regression
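The representer-theorem form above can be illustrated with a toy computation. In this sketch all data and parameter values are made up; it uses a Gaussian kernel between discretized curves, omits the intercept c_0 for brevity, and obtains the coefficients c_j from a ridge-type linear solve rather than the paper's GCV-tuned procedure:

```python
import numpy as np

# Toy setup (illustrative, not the paper's data): n curves observed on a
# common m-point grid, with a scalar response per curve.
rng = np.random.default_rng(0)
n, m = 30, 50
X = rng.standard_normal((n, m))                      # discretized curves
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)   # toy responses

# Gaussian kernel between curves: K_ij = exp(-||X_i - X_j||^2 / (2 sigma^2)).
sigma, lam = 5.0, 1e-2
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared norms
K = np.exp(-sq / (2 * sigma**2))                     # n x n Gram matrix

# Representer theorem: f(X) = sum_j c_j K(X, X_j); ridge-type solve for c.
c = np.linalg.solve(K + lam * np.eye(n), y)
f_hat = K @ c                                        # fitted values
print(f_hat.shape)  # (30,)
```

In practice the smoothing parameter lam (and the kernel bandwidth sigma) would be chosen by GCV, as proposed in the paper, rather than fixed by hand.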
“…See the discussion paper by Ramsay & Dalzell () for an introduction to functional linear regression. For methods extending multivariate nonparametric regression to functional data, see Ferraty & Vieu (), Horvath & Kokoszka (), Ferraty, Mas, & Vieu (), Ferraty & Vieu (), Lian (), Kadri et al (), and references therein. In particular, the book Ferraty & Vieu () introduces various methodologies for studying functional data in a nonparametric way based on kernel‐type estimators.…”
Section: Introduction