1978
DOI: 10.1007/bf01404567

Smoothing noisy data with spline functions

Abstract: Smoothing splines are well known to provide nice curves which smooth discrete, noisy data. We obtain a practical, effective method for estimating the optimum amount of smoothing from the data. Derivatives can be estimated from the data by differentiating the resulting (nearly) optimally smoothed spline. We consider the model $y_i = g(t_i) + \varepsilon_i$, $i = 1, 2, \ldots, n$, $t_i \in [0, 1]$, where $g \in W_2^{(m)} = \{f : f, f', \ldots, f^{(m-1)} \text{ abs. cont.}, f^{(m)} \in \mathcal{L}_2[0, 1]\}$, and the $\{\varepsilon_i\}$ are random errors with $E\,\varepsilon_i = 0$, $E\,\varepsilon_i \varepsilon_j = \sigma^2 \delta_{ij}$. The error vari…
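
The "practical, effective method" the abstract refers to is generalized cross validation (GCV): the smoothing parameter λ is chosen to minimize V(λ) = n‖(I − A(λ))y‖² / [tr(I − A(λ))]², where A(λ) is the influence matrix of the linear smoother. As a hedged illustration only, the sketch below applies that criterion to a discrete second-difference (Whittaker-style) smoother standing in for the spline; the λ grid, test signal, and noise level are assumptions for demonstration, not values from the paper.

```python
import numpy as np

def gcv_smooth(y, lams):
    """Choose the smoothing level by generalized cross validation (GCV).

    A discrete second-difference penalty (Whittaker-style smoother) stands
    in for the spline: for each candidate lam the fit is yhat = A(lam) @ y
    with A(lam) = (I + lam * D'D)^{-1}, and the Craven-Wahba score
    V(lam) = n * ||(I - A) y||^2 / tr(I - A)^2 is minimized over the grid.
    """
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)          # second-difference matrix
    P = D.T @ D                                  # roughness penalty
    best = None
    for lam in lams:
        A = np.linalg.inv(np.eye(n) + lam * P)   # influence ("hat") matrix
        resid = y - A @ y
        score = n * (resid @ resid) / (n - np.trace(A)) ** 2
        if best is None or score < best[0]:
            best = (score, lam, A @ y)
    return best                                  # (V(lam), lam, smoothed y)

# Illustrative data: noisy samples of a smooth curve on [0, 1].
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.2, size=t.size)
score, lam, yhat = gcv_smooth(y, lams=np.logspace(-2, 4, 30))
print(f"GCV-chosen lambda: {lam:.3g}")
```

Differentiating the smoothed values (e.g., np.gradient(yhat, t)) then gives the derivative estimates the abstract mentions.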

Cited by 2,833 publications (1,085 citation statements)
References 16 publications

Citation statements (ordered by relevance):

“…The best combination of α and p is obtained through a minimization of the generalized cross validation (GCV) statistic [Loader, 1999, p. 31]. Because it penalizes the use of excess parameters, the GCV provides a better estimate of the predictive risk of a model than simple measures of goodness of fit [e.g., Craven and Wahba, 1979]. The regression theory provides the local error of the estimate and, consequently, the confidence and prediction intervals.…”
Section: Data and Forecasting Methods (mentioning)
confidence: 99%
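
The quoted point, that GCV penalizes excess parameters where raw goodness of fit does not, can be made concrete. Below is a small Python sketch (hypothetical quadratic data and degree grid, not from the citing paper): the residual sum of squares always falls as the polynomial degree grows, while the GCV score n·RSS / tr(I − H)² turns back up once extra parameters stop paying for themselves.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 - x + x**2 + rng.normal(scale=0.3, size=x.size)  # true degree is 2

for deg in range(1, 9):
    X = np.vander(x, deg + 1)        # polynomial design matrix
    H = X @ np.linalg.pinv(X)        # hat matrix; tr(H) = deg + 1 parameters
    resid = y - H @ y
    rss = resid @ resid              # keeps shrinking as deg grows
    gcv = len(y) * rss / (len(y) - np.trace(H)) ** 2  # interior minimum
    print(f"degree {deg}: RSS = {rss:6.3f}   GCV = {gcv:.4f}")
```
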
“…[28] and [31] investigate methods of determining the appropriate level of smoothing by analyzing properties of the input data. LQ problems with various types of boundary conditions are studied in [7] and [23], but to the best of our knowledge, little work has been done on the type of periodic boundary conditions studied in this paper.…”
Section: D2 Related Work and Contributions (mentioning)
confidence: 99%
“…Ideally, to make Algorithm D.4.1 truly automatic, it should include a way of determining the magnitude of ε based on the input data, without knowledge of the contour. For regular smoothing splines, such methods are presented in, for instance, [28] and [31]. Ongoing work includes adapting such a method to Problem D.3.1 and Problem D.3.2.…”
Section: D6 Conclusion (mentioning)
confidence: 99%
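
The citing paper leaves the estimation method unspecified; purely as an illustration of "determining the magnitude of ε based on the input data", here is a minimal sketch of a standard difference-based noise estimate (in the spirit of Rice, 1984), with the caveat that this is not necessarily the method of [28] or [31].

```python
import numpy as np

def noise_scale(y):
    """Estimate the noise standard deviation from the data alone.

    For a slowly varying underlying curve g, successive differences
    y[i+1] - y[i] are dominated by the noise, so E[(y[i+1] - y[i])^2]
    is approximately 2 * sigma^2.
    """
    d = np.diff(np.asarray(y, dtype=float))
    return np.sqrt(np.mean(d * d) / 2.0)

# Illustration: the true sigma = 0.2 is recovered to within sampling error.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.2, size=t.size)
print(noise_scale(y))  # ~0.2
```
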
“…To apply the metric-based approach to this task, we define the metric d in terms of the squared prediction error err(ŷ, y) = (ŷ − y)² with a square-root normalization ϕ(z) = z^(1/2), as discussed in Section 2. To evaluate the efficacy of TRI in this problem we compared its performance to a number of standard model selection strategies, including: structural risk minimization, SRM (Cherkassky, Mulier, & Vapnik, 1997; Vapnik, 1996), RIC (Foster & George, 1994), SMS (Shibata, 1981), GCV (Craven & Wahba, 1979), BIC (Schwarz, 1978), AIC (Akaike, 1974), CP (Mallows, 1973), and FPE (Akaike, 1970). We also compared it to 10-fold cross validation, CVT (a standard hold-out method (Efron, 1979; Weiss & Kulikowski, 1991; Kohavi, 1995)).…”
Section: Example: Polynomial Regression (mentioning)
confidence: 99%
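
Since the quoted experiment scores polynomial degrees with several classical criteria, a compact sketch of how three of them (GCV, AIC, BIC) select a degree may help; the data-generating cubic, sample size, and degree grid are illustrative assumptions, and the citing paper's TRI and ADJ strategies are not reproduced here.

```python
import numpy as np

def select_degree(x, y, max_deg=10):
    """Score polynomial degrees with three classical selection criteria.

    For each degree d, fit by least squares (k = d + 1 parameters) and
    score the fit by GCV (Craven & Wahba, 1979), AIC (Akaike, 1974) and
    BIC (Schwarz, 1978); each criterion selects its minimizing degree.
    """
    n = len(y)
    scores = {"GCV": [], "AIC": [], "BIC": []}
    for d in range(1, max_deg + 1):
        k = d + 1
        coef = np.polyfit(x, y, d)
        rss = np.sum((y - np.polyval(coef, x)) ** 2)
        scores["GCV"].append(n * rss / (n - k) ** 2)
        scores["AIC"].append(n * np.log(rss / n) + 2 * k)
        scores["BIC"].append(n * np.log(rss / n) + np.log(n) * k)
    return {c: int(np.argmin(s)) + 1 for c, s in scores.items()}

# Illustrative data from a noisy cubic; all three criteria should tend
# to pick a degree near 3.
rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 60)
y = np.polyval([1.0, -2.0, 0.0, 0.5], x) + rng.normal(scale=0.25, size=x.size)
print(select_degree(x, y))  # e.g. {'GCV': 3, 'AIC': 3, 'BIC': 3}
```
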
“…The first class of methods we compared against were the same model selection methods considered before: 10-fold cross validation CVT, structural risk minimization SRM (Cherkassky, Mulier, & Vapnik, 1997), RIC (Foster & George, 1994), SMS (Shibata, 1981), GCV (Craven & Wahba, 1979), BIC (Schwarz, 1978), AIC (Akaike, 1974), CP (Mallows, 1973), FPE (Akaike, 1970), and the metric-based model selection strategy, ADJ, introduced in Section 3.3. However, since none of the statistical methods, RIC, SMS, GCV, BIC, AIC, CP, FPE, performed competitively in our experiments, we report results only for GCV, which performed the best among them.…”
Section: Example: Polynomial Regression (mentioning)
confidence: 99%