ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053095

Finite Sample Deviation and Variance Bounds for First Order Autoregressive Processes

Abstract: In this paper, we study non-asymptotic deviation bounds of the least squares estimator in Gaussian AR(n) processes. By relying on martingale concentration inequalities and a tail-bound for χ²-distributed variables, we provide a concentration bound for the sample covariance matrix of the process output. With this, we present a problem-dependent finite-time bound on the deviation probability of any fixed linear combination of the estimated parameters of the AR(n) process. We discuss extensions and limitations o…
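As a rough illustration of the quantities described in the abstract, the sketch below simulates a Gaussian AR(n) process, forms the least-squares estimate of its parameters, and Monte Carlo-estimates the deviation probability of a fixed linear combination w of the estimated parameters. All numerical choices (the order n = 2, the coefficients, the sample size T, the direction w, and the threshold) are illustrative assumptions and do not come from the paper, whose actual contribution is a closed-form finite-sample bound on this probability rather than a simulation.

```python
# Minimal sketch (not the paper's method): Monte Carlo illustration of
# least-squares estimation for a Gaussian AR(n) process and of the deviation
# of a fixed linear combination of the estimated parameters. The order n,
# true coefficients, sample size T, direction w, and threshold below are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 2                                  # assumed AR order
a_true = np.array([0.5, -0.3])         # assumed stable true parameters
T = 500                                # assumed number of output samples
w = np.array([1.0, 0.0])               # fixed linear combination of interest
eps_dev = 0.1                          # deviation threshold
n_trials = 1000


def simulate_ar(a, T, rng):
    """Simulate y_t = a_1*y_{t-1} + ... + a_n*y_{t-n} + e_t with Gaussian noise."""
    n = len(a)
    y = np.zeros(T + n)
    e = rng.standard_normal(T + n)
    for t in range(n, T + n):
        y[t] = a @ y[t - n:t][::-1] + e[t]
    return y[n:]


def ols_ar(y, n):
    """Least-squares estimate of AR(n) parameters from the output data y."""
    # Regressor rows are [y_{t-1}, ..., y_{t-n}] for t = n, ..., len(y)-1.
    Phi = np.column_stack([y[n - k - 1:len(y) - k - 1] for k in range(n)])
    a_hat, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return a_hat


# Empirical estimate of P(|w^T (a_hat - a_true)| > eps_dev).
deviations = np.array([
    abs(w @ (ols_ar(simulate_ar(a_true, T, rng), n) - a_true))
    for _ in range(n_trials)
])
print("empirical deviation probability:", np.mean(deviations > eps_dev))
```

Rerunning the sketch with larger T shows the empirical deviation frequency shrinking; that is the finite-sample behaviour the paper bounds in closed form, which the simulation only estimates numerically.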

Cited by 4 publications (4 citation statements)
References 30 publications
“…For instance, some assume that the process is auto-regressive (e.g. Lai and Wei (1983); Goldenshluger and Zeevi (2001); González and Rojas (2020)) or ergodic (e.g. Yu (1994); Duchi et al. (2012)).…”
Section: Introduction
mentioning, confidence: 99%
“…In this case, the modified algorithm can estimate the unknown parameters and the unknown noise simultaneously, with less computational cost and better accuracy. Non-asymptotic deviation bounds for least-squares estimation in Gaussian AR processes have been studied recently (see [68]). The study relies on martingale concentration inequalities and a tail bound for χ²-distributed variables; with these, the authors provide a concentration bound for the sample covariance matrix of the process output.…”
Section: Robust Version of Maximum Likelihood Methods With Whitening: …
mentioning, confidence: 99%
“…We mention two recent works related to this special case. Finite sample analysis of the OLS estimator for AR models with known order appeared in [32], where the parameter error in terms of the ℓ∞-norm is analyzed, as opposed to the ℓ2-norm error in our case. Non-asymptotic results for using the lasso estimator to learn AR models appeared in [33], which requires mixing time and a slightly different excitation condition, which can be less interpretable for a control audience than the system-related properties in our work.…”
Section: A Special Cases
mentioning, confidence: 99%
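As a small, self-contained illustration of the ℓ∞ versus ℓ2 distinction drawn in the statement above, the sketch below fits an AR(2) model by least squares and reports both error norms for the estimated coefficients. The true coefficients and sample size are illustrative assumptions, not values taken from [32] or from this paper.

```python
# Minimal, self-contained sketch contrasting the ℓ∞- and ℓ2-norm parameter
# errors of the least-squares estimator on a simulated Gaussian AR(2)
# process. The coefficients and sample size are illustrative assumptions,
# not values from [32] or from this paper.
import numpy as np

rng = np.random.default_rng(1)
a_true = np.array([0.5, -0.3])         # assumed stable AR(2) coefficients
T = 500

# Simulate y_t = a_1*y_{t-1} + a_2*y_{t-2} + e_t with standard Gaussian noise.
y = np.zeros(T + 2)
for t in range(2, T + 2):
    y[t] = a_true[0] * y[t - 1] + a_true[1] * y[t - 2] + rng.standard_normal()
y = y[2:]

# Least-squares fit with regressors [y_{t-1}, y_{t-2}].
Phi = np.column_stack([y[1:-1], y[:-2]])
a_hat, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)

err = a_hat - a_true
print("ell_inf error:", np.max(np.abs(err)))   # worst single-coefficient error
print("ell_2 error:  ", np.linalg.norm(err))   # overall Euclidean error
```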