2015
DOI: 10.1016/j.ifacol.2015.12.224

Sequential Monte Carlo Methods for System Identification

This work was supported by the projects Learning of complex dynamical systems (Contract number: 637-2014-466) and Probabilistic modeling of dynamical systems (Contract number: 621-2013-5524), both funded by the Swedish Research Council.

Abstract: One of the key challenges in identifying nonlinear and possibly non-Gaussian state space models (SSMs) is the intractability of estimating the system state. Sequential Monte Carlo (SMC) methods, such as the particle filter (introduced more than two decades ago), provide numerical solutions to the nonlinear state estimation problems arising in SSMs. When combined with additional identification techniques, these algorithms provide solid solutions to the nonlinear system identification problem. We describe two gen…
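As a concrete illustration of the state-estimation step the abstract refers to, the following is a minimal sketch of a bootstrap particle filter. The one-dimensional model, noise levels, and all function names are illustrative assumptions, not the paper's own example.

```python
# A minimal sketch of a bootstrap particle filter, the basic SMC method the
# abstract refers to. The model (a standard 1-D nonlinear benchmark-style SSM
# with Gaussian noise) and all names are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)

def f(x, t):
    # assumed state transition mean: x_{t+1} = f(x_t, t) + process noise
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def g(x):
    # assumed observation mean: y_t = g(x_t) + measurement noise
    return x**2 / 20.0

def bootstrap_pf(y, n_particles=500, q_std=np.sqrt(10.0), r_std=1.0):
    """Return filtered state means E[x_t | y_{1:t}] for observations y."""
    T = len(y)
    x = rng.normal(0.0, 1.0, size=n_particles)   # initial particle cloud
    x_means = np.empty(T)
    for t in range(T):
        # propagate particles through the (assumed) dynamics
        x = f(x, t) + rng.normal(0.0, q_std, size=n_particles)
        # weight particles by the likelihood of the current observation
        log_w = -0.5 * ((y[t] - g(x)) / r_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        x_means[t] = np.sum(w * x)
        # multinomial resampling to combat weight degeneracy
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return x_means

# Usage: simulate data from the same model and run the filter.
T = 50
y = np.zeros(T)
x_prev = rng.normal()
for t in range(T):
    x_prev = f(x_prev, t) + rng.normal(0.0, np.sqrt(10.0))
    y[t] = g(x_prev) + rng.normal(0.0, 1.0)
print(bootstrap_pf(y)[:5])
```

The resampling step is what keeps the particle weights from degenerating over time; more refined SMC schemes in the surveyed literature differ mainly in the choice of proposal and resampling mechanism.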

Cited by 29 publications (6 citation statements)
References 52 publications (43 reference statements)

“…In this paper, we shall assume that ν_θ follows a zero-mean Normal distribution with fixed standard deviation σ_ν (see e.g. [16, 18, 61]).…”
Section: Appendix
Mentioning confidence: 99%
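Taken on its own, the assumption quoted above only fixes a Gaussian form for the perturbation ν_θ; written out (with ν_θ and σ_ν as in the quote, and nothing else assumed about their role in the citing paper):

```latex
\nu_{\theta} \sim \mathcal{N}\!\left(0, \sigma_{\nu}^{2}\right),
\qquad
p(\nu_{\theta}) = \frac{1}{\sqrt{2\pi\sigma_{\nu}^{2}}}
\exp\!\left(-\frac{\nu_{\theta}^{2}}{2\sigma_{\nu}^{2}}\right).
```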
“…First, the computations of the score function require evaluations of the log-likelihood function and its gradient. Depending on the model, this may be a very challenging task; see [3,35,53]. Second, the performance is sensitive to the distributional assumptions; e.g., the optimal properties of the MLE are only valid if the distribution is correctly specified, otherwise the estimator may even become inconsistent; see for example [68].…”
Section: Assumption
Mentioning confidence: 99%
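A common way the SMC literature side-steps direct evaluation of the intractable log-likelihood mentioned in the snippet above is to replace it with the particle filter's estimate. A minimal sketch, assuming (purely for illustration) a linear-Gaussian toy model x_{t+1} = a·x_t + v_t, y_t = x_t + e_t with θ = (a, σ_v):

```python
# A minimal sketch of estimating log p(y_{1:T} | theta) with a bootstrap
# particle filter, i.e. the quantity the snippet above says is hard to
# evaluate for score-/ML-based identification. Model and names are assumed.
import numpy as np

rng = np.random.default_rng(1)

def particle_loglik(theta, y, n_particles=500, r_std=1.0):
    a, q_std = theta
    x = rng.normal(0.0, 1.0, size=n_particles)
    loglik = 0.0
    for t in range(len(y)):
        # propagate particles through the assumed AR(1) dynamics
        x = a * x + rng.normal(0.0, q_std, size=n_particles)
        # Gaussian observation density, including its normalising constant
        log_w = -0.5 * np.log(2 * np.pi * r_std**2) - 0.5 * ((y[t] - x) / r_std) ** 2
        m = log_w.max()
        w = np.exp(log_w - m)
        # log of the average unnormalised weight accumulates the likelihood
        loglik += m + np.log(w.mean())
        w /= w.sum()
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return loglik

# Usage: compare the estimated log-likelihood at two parameter values.
y = np.cumsum(rng.normal(size=100))  # placeholder data for the sketch
print(particle_loglik((0.9, 1.0), y), particle_loglik((0.1, 1.0), y))
```

In practice this estimator is what gets plugged into, e.g., particle Metropolis-Hastings or gradient-free optimisers, since exact gradients of the log-likelihood are not directly available for general nonlinear SSMs.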
“…For example, approximate Maximum Likelihood (ML) and Bayesian methods have been developed; see, for example, [29,50,54,69,41,66,64,20,19]. They are mainly based on sequential Monte Carlo (a.k.a particle filters/smoothers) and Markov chain Monte Carlo approximations (see [53]), and have been shown to provide impressive results on several examples and benchmark problems. However, depending on the application, they may be computationally expensive or even infeasible.…”
Section: Introduction
Mentioning confidence: 99%
“…The sequential Monte-Carlo (SMC) methods are well suited for adaptation to grey-box identification. These algorithms identify nonlinear systems in state space form, based on Bayesian equations [8-10]. Partly known physical parameters can be included in the dynamic models of some of these Bayesian algorithms, e.g.…”
Section: Introduction
Mentioning confidence: 99%
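A minimal sketch of the grey-box idea in the snippet above, assuming (purely for illustration) a mass-spring-damper model whose damping coefficient is only known up to a prior range: the uncertain parameter is augmented into the state and tracked by the same bootstrap particle filter. Every number and name below is an assumption, not taken from the citing paper.

```python
# Grey-box sketch: keep the known physical structure (stiffness, measurement
# equation) fixed and estimate only the partly known damping coefficient c by
# augmenting it into the particle filter's state. All values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
dt, k, r_std = 0.05, 4.0, 0.1            # step size, known stiffness, measurement noise

def augmented_pf(y, n_particles=1000):
    """Jointly filter position/velocity and the uncertain damping coefficient c."""
    pos = rng.normal(1.0, 0.1, n_particles)
    vel = np.zeros(n_particles)
    c = rng.uniform(0.1, 1.0, n_particles)        # prior range for the damping
    c_means = np.empty(len(y))
    for t, yt in enumerate(y):
        # discretised mass-spring-damper dynamics with small process noise
        acc = -k * pos - c * vel
        pos, vel = pos + dt * vel, vel + dt * acc + rng.normal(0, 0.01, n_particles)
        log_w = -0.5 * ((yt - pos) / r_std) ** 2  # position is measured
        w = np.exp(log_w - log_w.max()); w /= w.sum()
        c_means[t] = np.sum(w * c)                 # filtered estimate of c
        idx = rng.choice(n_particles, n_particles, p=w)
        pos, vel, c = pos[idx], vel[idx], c[idx]
        c = c + rng.normal(0, 1e-3, n_particles)   # small jitter against degeneracy
    return c_means

# Usage with data simulated from a "true" damping of c = 0.5:
pos_t, vel_t, y = 1.0, 0.0, []
for _ in range(200):
    acc = -k * pos_t - 0.5 * vel_t
    pos_t, vel_t = pos_t + dt * vel_t, vel_t + dt * acc
    y.append(pos_t + rng.normal(0, r_std))
print(augmented_pf(np.array(y))[-1])               # should drift toward ~0.5
```

This state-augmentation device is only one of the options hinted at in the snippet; its appeal for grey-box work is that the physically meaningful model structure stays intact while only the uncertain parameter is learned from data.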