2005
DOI: 10.1093/biomet/92.2.419
Hierarchical models for assessing variability among functions

Abstract: In many applications of functional data analysis, summarising functional variation based on fits, without taking account of the estimation process, runs the risk of attributing the estimation variation to the functional variation, thereby overstating the latter. For example, the first eigenvalue of a sample covariance matrix computed from estimated functions may be biased upwards. We display a set of estimated neuronal Poisson-process intensity functions where this bias is substantial, and we discuss two metho…

Cited by 48 publications
(44 citation statements)
References 24 publications
“…While we have obtained very good results using the Bayes factor, it does require both functions to use the same knot set (as would the likelihood ratio test), which may be restrictive. One reason we are concerned about using the same knot set for both functions is that in our recent study of methods for assessing variability among many curves we found [2], somewhat surprisingly, that a random-coefficient hierarchical model that assumed the same knot sets among all curves did not perform as well as an alternative approach based on fitting the curves separately. Therefore, in Section 4, we introduce our second procedure, which begins with separate BARS fits for f1(t) and f2(t) and produces a p-value based on an analogue of Hotelling's T² for testing equality of two multivariate Normal means.…”
Section: Introduction
confidence: 99%
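The Hotelling's T² procedure named in this excerpt can be illustrated with a small sketch. This is not the paper's BARS-based analogue, only the classical two-sample Hotelling's T² test for equality of two multivariate Normal mean vectors; the inputs here are hypothetical stand-ins for the estimated coefficient vectors the authors would compare:

```python
import numpy as np
from scipy import stats

def hotelling_t2(x, y):
    """Two-sample Hotelling's T^2 test for equality of two multivariate
    Normal means. x and y are (n_samples, p) arrays; returns the T^2
    statistic and an F-based p-value."""
    n1, p = x.shape
    n2, _ = y.shape
    d = x.mean(axis=0) - y.mean(axis=0)
    # pooled sample covariance of the two groups
    s = ((n1 - 1) * np.cov(x, rowvar=False)
         + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s, d)
    # T^2 is a scaled F statistic under the null hypothesis
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    pval = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, pval

# hypothetical "coefficient" samples from two conditions
rng = np.random.default_rng(0)
x = rng.normal(size=(30, 3))
y = rng.normal(size=(25, 3))
t2, p = hotelling_t2(x, y)
```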
“…DiMatteo et al showed that BARS could reduce mean squared error below other existing methods, and this new technology has been used in a variety of applications in neurophysiology, imaging, EEG analysis, and genetics [1–7]. Furthermore, BARS has been implemented in C, with calling functions in R and S, in publicly available software [8].…”
Section: Introduction
confidence: 99%
“…and the goal is to describe what these functions have in common and how they vary (Ramsay and Silverman, 2005; Behseta et al, 2005; Cheng et al, 2013). In that context functional additive models are very natural, and the simplest is the following two-level model:…”
Section: Introduction
confidence: 99%
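The two-level functional model this excerpt refers to decomposes each observed curve into a shared mean function plus a curve-specific deviation, observed with noise. A minimal simulation sketch, with invented mean and deviation shapes purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)          # common evaluation grid
mu = np.sin(2 * np.pi * t)              # population mean function (invented shape)
n_curves = 20
# curve-specific difference functions d_i: random scalings of a smooth shape
d = rng.normal(scale=0.3, size=(n_curves, 1)) * np.cos(2 * np.pi * t)
f = mu + d                              # level 1: individual curves f_i = mu + d_i
# level 2: noisy observations y_ij = f_i(t_j) + noise
y = f + rng.normal(scale=0.1, size=f.shape)
```

In practice each f_i would be estimated from its own noisy observations, which is exactly where the estimation-variance inflation discussed in the abstract enters.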
“…The two-level model appears on its own, or as an essential building block in more complex models, which may include several hierarchical levels (as in functional ANOVA, Ramsay and Silverman, 2005; Kaufman and Sain, 2009; Sain et al, 2011), time shifts (Cheng et al, 2013; Kneip and Ramsay, 2008; Telesca and Inoue, 2008), or non-Gaussian observations (Behseta et al, 2005). Functional PCA (Ramsay and Silverman, 2005), which seeks to characterise the distribution of the difference functions d_i, is a closely related variant.…”
Section: Introduction
confidence: 99%
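Functional PCA on a common grid amounts to an eigendecomposition of the sample covariance of the centred curves. A minimal sketch (assuming curves pre-evaluated on a shared grid; note the abstract's caveat that when the curves are themselves estimates, the leading eigenvalues can be biased upwards by estimation noise):

```python
import numpy as np

def functional_pca(curves):
    """Sketch of grid-based functional PCA.

    curves: (n_curves, n_gridpoints) array of curve evaluations.
    Returns eigenvalues (descending) and matching eigenvectors
    (columns) of the sample covariance of the centred curves.
    """
    d = curves - curves.mean(axis=0)        # difference functions d_i
    cov = d.T @ d / (d.shape[0] - 1)        # sample covariance matrix
    evals, evecs = np.linalg.eigh(cov)      # symmetric eigendecomposition
    order = np.argsort(evals)[::-1]         # largest variance first
    return evals[order], evecs[:, order]
```

The first eigenvalue summarises the dominant mode of curve-to-curve variation; it is precisely this quantity that the paper shows can be overstated when estimation variation is ignored.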
“…To utilize Bayesian Adaptive Regression Splines (BARS) [7–9] to estimate the underlying shape functions g_k for k = 1, …, K. BARS has been shown to provide parsimonious fits (e.g.…”
confidence: 99%