We prove that the Hodrick-Prescott filter (HPF), a commonly used method for smoothing econometric time series, is a special case of a linear penalized spline model with knots placed at all observed time points (except the first and last) and uncorrelated residuals. This equivalence furnishes a rich variety of existing data-driven methods for estimating the smoothing parameter, notably restricted maximum likelihood (REML) and generalized cross-validation (GCV). This has profound implications for users of the HPF, who have hitherto typically relied on subjective choice, rather than estimation, of the smoothing parameter. By viewing estimates as roots of an appropriate quadratic estimating equation, we also present a new approach for constructing confidence intervals for the smoothing parameter. The method is akin to a parametric bootstrap in which Monte Carlo simulation is replaced by saddlepoint approximation, and it provides a fast and accurate alternative to exact methods when they exist (e.g., under REML). More importantly, it is the only computationally feasible method when no exact methods exist (e.g., under GCV). The methodology is demonstrated on the Gross National Product (GNP) series originally analyzed by Hodrick and Prescott (1997). With proper attention paid to the residual correlation structure, we show that REML-based estimation delivers an appropriate smooth for both the GNP series and its returns.
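To make the penalized-spline connection concrete: the HPF trend solves min over tau of ||y - tau||^2 + lambda * ||D tau||^2, where D is the second-difference matrix, giving tau = (I + lambda * D'D)^{-1} y. The Python sketch below implements this solution directly; the simulated series and the conventional quarterly value lambda = 1600 are placeholders, not the REML/GCV estimates the paper advocates.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend via penalized least squares:
    tau = argmin ||y - tau||^2 + lam * ||D tau||^2,
    where D is the (T-2) x T second-difference matrix."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Second-difference operator: (D tau)_t = tau_t - 2 tau_{t+1} + tau_{t+2}
    D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(T - 2, T))
    A = sparse.eye(T) + lam * (D.T @ D)
    trend = spsolve(A.tocsc(), y)
    return trend, y - trend  # trend and cyclical component

# Usage with simulated data (a placeholder for the GNP series):
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) + 0.05 * np.arange(200)
trend, cycle = hp_filter(y, lam=1600.0)
```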
Active bone marrow is one of the more radiosensitive tissues in the human body; hence, it is important to predict, and where possible avoid, myelotoxicity in radionuclide therapies. The MIRD schema currently used to calculate marrow dose generally requires knowledge of the patient's total skeletal active marrow mass, a value that at present cannot be directly measured. Conceptually, the active marrow mass in a given skeletal region may be obtained given knowledge of the trabecular spongiosa volume (SV) of the bone site. A recent study established a multiple regression model for easily calculating total skeletal SV (TSSV) from simple skeletal measurements obtained from a pelvic CT scan or radiograph. That model, based on data from only 20 cadavers, did not account for sex differences in TSSV. This study thus extends that work toward sex-specific models. Methods: Twenty male and 20 female cadavers were subjected to whole-body CT. Bone sites containing active bone marrow were manually segmented to obtain SV at each site. In addition to age and height, 14 CT-based skeletal measurements were recorded for each cadaver. Multiple linear regression techniques were used to determine the best subset of measurements allowing an accurate prediction of TSSV. Results: A pooled model (R² = 0.76) and a sex-specific model (R² = 0.79) are provided. A leave-one-out analysis reveals that these models predict total SV with less than 10% error for 50%-70% of subjects, and with less than 20% error for 70%-90% of subjects. Tables were constructed giving the percent distribution of SV across active marrow-containing bone sites for both males and females. Conclusion: This study provides models that can be used to simply, yet accurately, predict total SV in individuals within the clinical setting. The models require only 2 or 3 skeletal measurements that are easily obtained from a pelvic CT scan. Although this study does not conclusively determine which model is best at predicting TSSV, the sex-specific model is the most consistent at providing reasonable estimates. The study also explains how the predictive TSSV model can be used to estimate patient-specific active bone marrow mass under the assumption of reference values for marrow volume fraction and bone marrow cellularity by skeletal site.
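The model-building and validation loop described above, multiple linear regression followed by leave-one-out prediction error, can be sketched in Python as follows. The data, coefficients, and predictor columns are illustrative placeholders, not the paper's cadaver measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

# X: rows = cadavers, columns = 2-3 selected skeletal measurements
# (hypothetical stand-ins; the paper chooses these by best-subset
# selection from 14 CT-based measurements plus age and height).
# y: total skeletal spongiosa volume (TSSV) per cadaver.
rng = np.random.default_rng(1)
X = rng.normal(loc=[30.0, 15.0, 10.0], scale=2.0, size=(40, 3))
y = 50.0 + 8.0 * X[:, 0] + 4.0 * X[:, 1] + rng.normal(scale=20.0, size=40)

# Leave-one-out: refit on n-1 subjects, predict the held-out subject.
errors = []
for train, test in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train], y[train])
    pred = model.predict(X[test])[0]
    errors.append(abs(pred - y[test][0]) / y[test][0])  # relative error

errors = np.array(errors)
print("subjects within 10% error:", np.mean(errors < 0.10))
print("subjects within 20% error:", np.mean(errors < 0.20))
```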
Commodity price volatility has created concerns for central bank policy-makers. Commodity prices peaked in the aftermath of the 2007 financial crisis, and they have remained relatively volatile since. Because commodity prices are often seen as causally connected with inflation and real output, understanding the driving forces behind their volatility is necessary for the conduct of monetary policy. Using an autoregressive moving average model with an exponential generalized autoregressive conditional heteroscedastic (ARMA-EGARCH) process, we extract the conditional variance series to identify volatility spillovers between monetary policy and commodity price indices. The findings show that the volatility of the agricultural commodity price index, as well as that of other commodity price indices, overshoots the long-run equilibrium price in response to an impulse in monetary policy.

Keywords: ARMA-EGARCH model, commodity prices, monetary policy, overshooting hypothesis, VECM approach
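A conditional variance series of the kind used above can be extracted with the Python arch package. The sketch below is a minimal illustration rather than the paper's specification: arch supports AR (not full ARMA) conditional means, and the simulated returns stand in for commodity index returns.

```python
import numpy as np
from arch import arch_model

# Simulated returns as a placeholder for a commodity price index's
# log returns.
rng = np.random.default_rng(2)
returns = rng.normal(scale=1.0, size=1000)

# AR(1) mean with an EGARCH(1,1,1) variance. The paper specifies an
# ARMA mean; this sketch approximates it with an AR mean, adding
# further AR lags if needed.
am = arch_model(returns, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1)
res = am.fit(disp="off")

# The extracted conditional variance series, which the paper then
# feeds into a VECM/spillover analysis (not shown here).
cond_var = res.conditional_volatility ** 2
```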
We propose autoregressive moving average (ARMA) and generalized autoregressive conditional heteroscedastic (GARCH) models driven by asymmetric Laplace (AL) noise. The AL distribution plays, within the geometric-stable class, the role analogous to that played by the normal within the alpha-stable class, and has shown promise in the modelling of certain types of financial and engineering data. In the case of an ARMA model we derive the marginal distribution of the process, as well as its bivariate distribution at pairs of time points separated by a finite number of lags. The calculation of exact confidence bands for minimum mean-squared error linear predictors is shown to be straightforward. Conditional maximum likelihood-based inference is advocated, and the corresponding asymptotic results are discussed. The models are particularly suited to processes that are skewed, peaked, and leptokurtic, but which nonetheless appear to possess some higher-order moments. A case study of a fund of real estate returns reveals that AL noise models tend to deliver a superior fit with substantially fewer parameters than their normal noise counterparts, and provide both a competitive fit and a greater degree of numerical stability relative to other skewed distributions.
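AL noise is easy to simulate through its standard normal variance-mean mixture representation, which makes it straightforward to drive an ARMA recursion. A minimal sketch, with hypothetical parameter values:

```python
import numpy as np

def asymmetric_laplace(n, mu=0.5, sigma=1.0, rng=None):
    """Draw AL variates via the normal variance-mean mixture
    X = mu*W + sigma*sqrt(W)*Z, with W ~ Exp(1), Z ~ N(0,1).
    mu controls the skewness; mu = 0 gives the symmetric Laplace."""
    rng = rng or np.random.default_rng()
    W = rng.exponential(1.0, size=n)
    Z = rng.standard_normal(n)
    return mu * W + sigma * np.sqrt(W) * Z

def simulate_arma11(n, phi=0.6, theta=0.3, burn=200, rng=None):
    """ARMA(1,1) driven by AL noise:
    X_t = phi*X_{t-1} + eps_t + theta*eps_{t-1}."""
    rng = rng or np.random.default_rng()
    eps = asymmetric_laplace(n + burn, rng=rng)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x[burn:]  # discard burn-in to approximate stationarity

x = simulate_arma11(1000, rng=np.random.default_rng(3))
```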
We propose an easy-to-implement method for making small-sample parametric inference about the root of an estimating equation expressible as a quadratic form in normal random variables. It is based on saddlepoint approximations to the distribution of the estimating equation whose unique root is a parameter's maximum likelihood estimator (MLE), with conditional MLEs substituted for the remaining (nuisance) parameters. Monotonicity of the estimating equation in its parameter argument enables us to relate these approximations to those for the estimator of interest. The proposed method is equivalent to a parametric bootstrap percentile approach in which Monte Carlo simulation is replaced by saddlepoint approximation. It finds applications in many areas of statistics, including nonlinear regression, time series analysis, inference on ratios of regression parameters in linear models, and calibration. We demonstrate the method on some classical examples from nonlinear regression models and ratio-of-regression-parameters problems. Simulation results show that the proposed method, apart from being generally easier to implement, yields confidence intervals whose lengths and coverage probabilities compare favourably with those obtained from several competing methods proposed in the literature over the past half-century.
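For a quadratic form Q = sum_i lam_i * chi2_1 in independent standard normals, the saddlepoint machinery reduces to solving K'(s) = q for the cumulant generating function K(s) = -0.5 * sum_i log(1 - 2*s*lam_i) and applying the Lugannani-Rice formula. A minimal Python sketch (the weights are hypothetical, and this is a generic illustration rather than the paper's full nuisance-parameter treatment):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def saddlepoint_cdf(q, lam):
    """Lugannani-Rice saddlepoint approximation to P(Q <= q) for
    Q = sum_i lam_i * chi2_1, a quadratic form in iid N(0,1) variables.
    CGF: K(s) = -0.5 * sum log(1 - 2*s*lam_i), valid for s < 1/(2*max lam)."""
    lam = np.asarray(lam, dtype=float)
    K = lambda s: -0.5 * np.sum(np.log(1.0 - 2.0 * s * lam))
    K1 = lambda s: np.sum(lam / (1.0 - 2.0 * s * lam))
    K2 = lambda s: np.sum(2.0 * lam**2 / (1.0 - 2.0 * s * lam) ** 2)

    # Solve the saddlepoint equation K'(s) = q on the valid interval.
    upper = 1.0 / (2.0 * lam.max()) - 1e-8 if lam.max() > 0 else 50.0
    lower = 1.0 / (2.0 * lam.min()) + 1e-8 if lam.min() < 0 else -50.0
    s_hat = brentq(lambda s: K1(s) - q, lower, upper)
    if abs(s_hat) < 1e-7:
        return 0.5  # formula is singular at the mean; crude leading term

    w = np.sign(s_hat) * np.sqrt(2.0 * (s_hat * q - K(s_hat)))
    u = s_hat * np.sqrt(K2(s_hat))
    return norm.cdf(w) + norm.pdf(w) * (1.0 / w - 1.0 / u)

# Example: P(Q <= 3) for Q = 2*chi2_1 + 0.5*chi2_1 (weights hypothetical).
p = saddlepoint_cdf(3.0, [2.0, 0.5])
```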