2021
DOI: 10.1111/jtsa.12627
Regularized estimation of high‐dimensional vector autoregressions with weakly dependent innovations

Abstract: There has been considerable progress in understanding the properties of sparse regularization procedures in high‐dimensional models. In the time series context, this work is mostly restricted to Gaussian autoregressions or mixing sequences. We study oracle properties of LASSO estimation of weakly sparse vector‐autoregressive models with heavy‐tailed, weakly dependent innovations. In contrast to the current literature, our innovation process satisfies an L1‐mixingale‐type condition on the centered conditional covariance matrices…
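To make the object of study concrete, here is a minimal sketch of equation‐by‐equation LASSO estimation of a sparse VAR, the estimator whose oracle properties the paper analyzes. The simulated design (a diagonal VAR(1) with Student‐t innovations as a stand‑in for the heavy‐tailed setting), the penalty level, and the scikit‑learn solver are illustrative assumptions, not the paper's data or implementation.

```python
# Minimal sketch: equation-by-equation LASSO for a VAR(p).
# All numbers below (d, T, lambda, the transition matrix) are
# illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a small, stable VAR(1) with heavy-tailed innovations
# (Student-t with 3 degrees of freedom).
d, T, p = 10, 500, 1
A = 0.5 * np.eye(d)                      # sparse transition matrix
Y = np.zeros((T, d))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.standard_t(df=3, size=d)

# Stack lagged regressors X_t = (y_{t-1}', ..., y_{t-p}')'.
X = np.hstack([Y[p - j - 1:T - j - 1] for j in range(p)])
Z = Y[p:]

# Fit each of the d equations separately with an l1 penalty; error
# bounds in this literature are typically stated equation-wise.
lam = np.sqrt(np.log(d * p) / (T - p))   # rate-motivated choice (illustrative)
A_hat = np.vstack([
    Lasso(alpha=lam, fit_intercept=False).fit(X, Z[:, i]).coef_
    for i in range(d)
])
print("max abs estimation error:", np.abs(A_hat - A).max())
```

The equation‐wise formulation matters because the theoretical results discussed below bound the estimation error of each row of the transition matrix separately, rather than for the full system at once.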

Cited by 15 publications (11 citation statements)
References 36 publications (65 reference statements)
“…Wong et al (2020) derived finite-sample guarantees for the LASSO in a misspecified VAR model involving β-mixing processes with sub-Weibull marginal distributions. Masini et al (2019) derived equation-wise error bounds for the LASSO estimator of weakly sparse VAR(p) models in mixingale dependence settings, which include models with conditionally heteroscedastic innovations.…”
Section: Theoretical Properties
confidence: 99%
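For readers unfamiliar with the terminology above, a generic L1-mixingale condition for an adapted, centered sequence reads as follows. This is the standard textbook definition (e.g., Davidson); the paper's exact condition on the centered conditional covariance matrices may differ in its details.

$$\mathbb{E}\,\bigl\|\,\mathbb{E}[X_t \mid \mathcal{F}_{t-m}]\,\bigr\| \;\le\; c_t\,\psi_m, \qquad \psi_m \to 0 \text{ as } m \to \infty,$$

where $\{\mathcal{F}_t\}$ is a filtration, $\{c_t\}$ are nonnegative scaling constants, and $X_t$ is the centered quantity of interest — here, roughly, the innovation outer products minus their conditional means. The condition requires distant-past predictions of $X_t$ to vanish in $L_1$, which is weaker than mixing or martingale-difference assumptions.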
“…in that case, such that typical m.d.s. assumptions as used, for instance, in Medeiros and Mendes (2016) and Masini et al (2019) do not allow for dynamic misspecification. Wong et al (2020) also allow for misspecification by permitting mixing errors, which form a subset of the error processes allowed here.…”
Section: The High-Dimensional Linear Model
confidence: 99%
“…Exceptions are Medeiros and Mendes (2016), Masini et al (2019), and Wong et al (2020). Medeiros and Mendes (2016) consider the adaptive lasso for sparse, high-dimensional time series models and show that it is model selection consistent and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic.…”
Section: Introduction
confidence: 99%
“…Medeiros and Mendes (2016) studied the time series lasso with non-Gaussian and heteroskedastic covariates. Masini et al (2019) derived an oracle inequality for sparse VAR models with fat probability tails under various dependence settings. Wong et al (2020) studied the time series lasso under general tail and dependence conditions by relaxing a restriction in Basu and Michailidis (2015).…”
confidence: 99%