Sample Complexity Lower Bounds for Linear System Identification

Preprint, 2019
DOI: 10.48550/arxiv.1903.10343

Cited by 3 publications (7 citation statements); references 0 publications.
“…Proof. The proof of this result is essentially identical to the proof of Theorem 1 in [23] and we omit it here.…”
Section: G Lower Bound
confidence: 86%
“…We base our analysis off the lower bound presented in [23]. A slight modification of their analysis to our situation yields the following result.…”
Section: G Lower Bound
confidence: 97%
“…trajectories. There is also a substantial body of literature on (sub-)optimal finite-sample concentration bounds for linear systems identified via least squares estimation (Simchowitz et al. 2018; Jedra and Proutiere 2019; Sarkar and Rakhlin 2019; Jedra and Proutiere 2020; Sarkar, Rakhlin, and Dahleh 2020). These approaches offer fast learning rates but cannot guarantee stability of the identified systems for finite sample sizes.…”
Section: Related Work
confidence: 99%
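The least-squares estimator this citation refers to can be sketched in a few lines. The following is a minimal illustration, not any cited paper's actual experiment: the system matrix `A_true`, trajectory length `T`, and noise level `sigma` are all assumptions chosen for the demo, and the estimator is plain ordinary least squares on one trajectory.

```python
import numpy as np

# Sketch: identify x_{t+1} = A x_t + w_t from a single trajectory
# via ordinary least squares. All constants below are illustrative.
rng = np.random.default_rng(0)

n = 3                     # state dimension (assumption)
A_true = 0.5 * np.eye(n)  # a stable system matrix (assumption)
T = 5000                  # trajectory length (assumption)
sigma = 0.1               # process-noise standard deviation (assumption)

# Roll out one trajectory driven by Gaussian process noise.
X = np.zeros((n, T + 1))
for t in range(T):
    X[:, t + 1] = A_true @ X[:, t] + sigma * rng.standard_normal(n)

# OLS: A_hat = argmin_A sum_t ||x_{t+1} - A x_t||^2,
# solved in closed form as (sum x_{t+1} x_t^T)(sum x_t x_t^T)^{-1}.
past, future = X[:, :-1], X[:, 1:]
A_hat = future @ past.T @ np.linalg.inv(past @ past.T)

print(np.linalg.norm(A_hat - A_true))  # estimation error shrinks as T grows
```

Finite-sample analyses of the kind cited above bound exactly this error as a function of `T`; the lower bounds in the present paper show how large `T` must be for any estimator to achieve a given accuracy.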
“…Recently, results in statistical learning theory, self-normalizing processes [3] and high dimensional probability [4], have motivated control, signal processing, and machine learning communities to explore finite-time properties of linear system estimates by least squares. Among topics of interest, we can find sample complexity bounds [5], 1 − δ probability bounds on parameter errors [6], confidence bounds [7,Chap. 20], Cramér-Rao lower bounds [8], and bounds on quadratic criterion cost deviation [9,10].…”
Section: Introduction
confidence: 99%