2018
DOI: 10.1109/lcsys.2017.2720969
Finite-Sample System Identification: An Overview and a New Correlation Method

Abstract: Finite-sample system identification algorithms can be used to build guaranteed confidence regions for unknown model parameters under mild statistical assumptions. It has been shown that in many circumstances these rigorously built regions are comparable in size and shape to those that could be built by resorting to the asymptotic theory. The latter sets are, however, not guaranteed for finite samples and can sometimes lead to misleading results. The general principles behind finite-sample methods make them vir…

Cited by 48 publications (23 citation statements)
References 11 publications
“…Consider the two asymptotic variance matrices cov(θ N ) and cov(η N ) obtained in Section 6 above through our own mathematical derivations; the ideal case is that cov(θ N ) or cov(η N ) is zero or approaches zero as N → ∞. These two asymptotic variance matrices can therefore measure the identification quality, i.e., verify whether our parameter estimators are efficient or asymptotically efficient (Carè et al., 2018). To achieve efficiency or asymptotic efficiency, the asymptotic variance matrix obtained here is an important tool.…”
Section: Application
confidence: 91%
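The excerpt's claim that cov(θ N ) shrinking toward zero as N → ∞ signals estimator quality can be illustrated empirically. The sketch below is purely illustrative (the scalar model y_t = θ·x_t + e_t, the noise level, and all parameter values are hypothetical, not taken from the cited work); it shows the Monte Carlo variance of the least-squares estimate decreasing roughly as 1/N:

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_estimates(n_samples, n_trials=200, theta_true=2.0):
    """Monte Carlo least-squares estimates of a scalar parameter
    in the hypothetical toy model y_t = theta * x_t + e_t."""
    estimates = []
    for _ in range(n_trials):
        x = rng.normal(size=n_samples)
        y = theta_true * x + rng.normal(scale=0.5, size=n_samples)
        estimates.append((x @ y) / (x @ x))  # closed-form LS estimate
    return np.array(estimates)

# The empirical variance of theta_N should shrink roughly like 1/N.
var_small = ls_estimates(50).var()
var_large = ls_estimates(5000).var()
```

With 100 times more data per trial, the empirical variance drops by roughly two orders of magnitude, matching the asymptotic picture the excerpt appeals to.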
“…This section presents the proposed framework to quantify the uncertainty of kernel-based estimates. It is inspired by and builds on recent results from finite-sample system identification, such as the SPS and DP methods (Campi and Weyer 2005; Csáji 2016; Kolumbán 2016; Carè et al. 2018). Novelties with respect to these approaches include that our framework considers nonparametric regression and does not require the "true" function to be in the model class.…”
Section: Non-asymptotic Distribution-free Framework
confidence: 99%
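The Sign-Perturbed Sums (SPS) construction referenced above can be sketched for a scalar linear regression. This is a minimal illustration under stated assumptions (the data, the Laplacian noise, and the parameter values are all hypothetical), not the generalized DP framework of the cited papers: a candidate θ is kept in the confidence region unless the reference sum ranks among the q largest of m sums, which gives exact coverage 1 − q/m under noise symmetric about zero.

```python
import numpy as np

rng = np.random.default_rng(1)

def sps_indicator(theta, x, y, m=100, q=5):
    """Sign-Perturbed Sums membership test for y_t = theta * x_t + e_t.

    Returns True iff theta lies inside the exact (1 - q/m) confidence
    region, assuming the noise e_t is symmetric about zero."""
    residuals = y - theta * x
    s0 = abs(np.sum(x * residuals))  # reference (unperturbed) sum
    perturbed = [
        abs(np.sum(rng.choice([-1.0, 1.0], size=x.size) * x * residuals))
        for _ in range(m - 1)
    ]  # m - 1 sign-perturbed copies of the same sum
    exceed = sum(s > s0 for s in perturbed)
    # Accept theta unless s0 is among the q largest of the m sums.
    return bool(exceed >= q)

# Hypothetical data: true parameter 1.5, symmetric (Laplacian) noise.
x = rng.normal(size=200)
y = 1.5 * x + rng.laplace(size=200)
```

A wildly wrong candidate such as θ = 100 makes the reference sum dominate every sign-perturbed sum and is rejected with overwhelming probability, while the true θ is accepted with probability 1 − q/m = 95% in this configuration.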
“…Here, we propose a non-asymptotic, distribution-free approach to quantify the uncertainty of kernel-based models, which can be used for hypothesis testing and confidence region construction. We build on recent developments in finite-sample system identification (Campi and Weyer 2005; Carè et al. 2018); more specifically, we build on the Sign-Perturbed Sums (SPS) algorithm and its generalizations, the Data Perturbation (DP) methods (Kolumbán 2016).…”
Section: Introduction
confidence: 99%
“…Among the model-based methods, Kalman filter-based SOC estimation methods have some advantages, such as self-correction, online computation, and reduced complexity. The Kalman filter was first proposed to estimate the state of linear systems [12]; to apply it to nonlinear systems, the extended Kalman filter and unscented Kalman filter were developed [11]. Meanwhile, the data-driven methods typically include the lookup table method, machine learning-based methods, artificial neural networks, and support vector machines [13].…”
Section: Introduction
confidence: 99%
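The predict/update recursion underlying Kalman filter-based SOC estimation can be shown with a scalar filter. This is a generic sketch, not the method of the cited battery papers: the random-walk state model and the noise variances q and r are illustrative assumptions.

```python
def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.5, p0=1.0):
    """Minimal scalar Kalman filter for the hypothetical random-walk
    model x_t = x_{t-1} + w_t, z_t = x_t + v_t, with process noise
    variance q and measurement noise variance r."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: state unchanged, uncertainty grows by process noise.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

Fed a steady measurement (say a constant 0.8 standing in for a SOC reading), the estimate converges toward it while the gain k decays, which is the self-correcting, online behavior the excerpt highlights.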