2019
DOI: 10.2139/ssrn.3515288

A Higher-Order Correct Fast Moving-Average Bootstrap for Dependent Data

Abstract: We develop and implement a novel fast bootstrap for dependent data. Our scheme is based on the i.i.d. resampling of the smoothed moment indicators. We characterize the class of parametric and semi-parametric estimation problems for which the method is valid. We show the asymptotic refinements of the proposed procedure, proving that it is higher-order correct under mild assumptions on the time series, the estimating functions, and the smoothing kernel. We illustrate the applicability and the advantages of our p…
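To make the abstract's resampling idea concrete, here is a minimal sketch of i.i.d. resampling of kernel-smoothed moment indicators to obtain a bootstrap critical value for a J-type statistic. The function name, the flat studentization, and the exact statistic are illustrative assumptions, not the paper's definitive construction.

```python
import numpy as np

def fast_ma_bootstrap_crit(moments, kernel_weights, n_boot=999, alpha=0.05, seed=0):
    """Bootstrap critical value for a J-type statistic via i.i.d. resampling of
    kernel-smoothed moment indicators (hypothetical helper, not the paper's code).

    moments        : (T, k) array of moment indicators g_t evaluated at the estimate
    kernel_weights : (m,) array of smoothing weights summing to one
    """
    rng = np.random.default_rng(seed)
    T, k = moments.shape
    m = len(kernel_weights)
    pad = m // 2

    # 1) Smooth the indicators with a moving average; the smoothing absorbs the
    #    serial dependence so the smoothed terms can be resampled i.i.d.
    padded = np.pad(moments, ((pad, pad), (0, 0)), mode="edge")
    smoothed = np.empty_like(moments, dtype=float)
    for t in range(T):
        smoothed[t] = kernel_weights @ padded[t:t + m]

    # 2) Center so the bootstrap imposes the moment condition E[g] = 0.
    centered = smoothed - smoothed.mean(axis=0)

    # 3) i.i.d. resample the smoothed indicators; the parameter is never
    #    re-estimated inside the loop, which is what makes the scheme fast.
    j_boot = np.empty(n_boot)
    for b in range(n_boot):
        g_star = centered[rng.integers(0, T, size=T)]
        gbar = g_star.mean(axis=0)
        V = np.atleast_2d(np.cov(g_star, rowvar=False))
        j_boot[b] = T * gbar @ np.linalg.solve(V, gbar)   # J-type statistic

    return np.quantile(j_boot, 1 - alpha)                 # bootstrap critical value
```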

Cited by 3 publications (2 citation statements)
References 98 publications (174 reference statements)
“…For GMM-based estimation of an instrumental variable linear regression model, Inoue and Shintani (2006) demonstrate that the MBB yields asymptotic refinements for a studentized t-statistic for a linear combination of the parameter vector and for the distribution of the Hansen (1982) J-statistic. Since the first draft of the paper was written and after its original submission, we became aware of the recent paper La Vecchia et al (2020), which develops a fast bootstrap method in the GMM framework, similar to the KBB explored here, for the construction of a confidence region for the full parameter vector. Their construction of a confidence region inverts the non-rejection region of a test based on GEL estimation (Smith, 2011) utilizing an asymptotically pivotal Hansen (1982) J-statistic with bootstrap critical values.…”
Section: Introduction
confidence: 99%
“…Their construction of a confidence region inverts the non-rejection region of a test based on GEL estimation (Smith, 2011) utilizing an asymptotically pivotal Hansen (1982) J-statistic with bootstrap critical values. The La Vecchia et al (2020) procedure is fast, not requiring parameter estimation at each bootstrap replication, and achieves an asymptotic refinement.…”
Section: Introduction
confidence: 99%
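To illustrate the test-inversion step described in this citation statement, here is a minimal sketch of building a confidence region by keeping every candidate parameter value whose J statistic does not exceed its bootstrap critical value. It reuses the hypothetical fast_ma_bootstrap_crit sketch above; the scalar location model and the simulated AR(1) data are purely illustrative assumptions, not taken from the cited papers.

```python
import numpy as np

def j_statistic(moments):
    """Hansen-type J statistic T * gbar' V^{-1} gbar for a (T, k) moment array."""
    T = moments.shape[0]
    gbar = moments.mean(axis=0)
    V = np.atleast_2d(np.cov(moments, rowvar=False))
    return T * gbar @ np.linalg.solve(V, gbar)

def confidence_region(x, theta_grid, kernel_weights, alpha=0.05):
    """Invert the non-rejection region of the bootstrap J test over a grid:
    keep every theta whose J statistic stays below the bootstrap critical value."""
    kept = []
    for theta in theta_grid:
        g = (x - theta).reshape(-1, 1)      # moment indicator of a location model
        crit = fast_ma_bootstrap_crit(g, kernel_weights, alpha=alpha)
        if j_statistic(g) <= crit:          # theta is not rejected: keep it
            kept.append(theta)
    return np.array(kept)

# Illustrative usage on simulated AR(1) data.
rng = np.random.default_rng(1)
T = 400
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
weights = np.ones(7) / 7.0                  # flat moving-average kernel
grid = np.linspace(-1.0, 1.0, 101)
print(confidence_region(x, grid, weights))  # approximate 95% confidence region
```

Because the critical value is obtained by resampling the smoothed indicators rather than by re-estimating the parameter at every replication, the grid search over theta stays cheap, which is the sense in which the procedure is "fast".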