Proceedings of the 36th IEEE Conference on Decision and Control
DOI: 10.1109/cdc.1997.657905

System identification using an over-parametrized model class - improving the optimization algorithm

Abstract: The use of an over-parametrized state-space model for system identification has some clear advantages: a single model structure covers the entire class of multivariable systems up to a given order. The over-parametrization also makes it possible to choose a numerically stable parametrization. During the parametric optimization, the gradient calculations constitute the main computational part of the algorithm. Consequently, using more than the minimal number of parameters required slows down the algorithm.…
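As an illustration of the model class described in the abstract, the following Python/NumPy sketch shows a fully parametrized (over-parametrized) discrete-time state-space model and the output-error cost that a gradient-based identification routine would minimize. It is a minimal sketch, not the authors' implementation; the function names, the parameter ordering, and the zero initial state are assumptions made for the example.

```python
import numpy as np

def unpack(theta, n, m, p):
    """Map a flat parameter vector to the matrices (A, B, C, D).

    All n*n + n*m + p*n + p*m entries are free, so a single structure of
    order n covers every multivariable system of that order, at the price
    of n*n redundant directions (similarity transformations of the state).
    """
    iA = n * n
    iB = iA + n * m
    iC = iB + p * n
    A = theta[:iA].reshape(n, n)
    B = theta[iA:iB].reshape(n, m)
    C = theta[iB:iC].reshape(p, n)
    D = theta[iC:].reshape(p, m)
    return A, B, C, D

def simulate(theta, u, n, m, p):
    """Simulate x(t+1) = A x(t) + B u(t), y(t) = C x(t) + D u(t), x(0) = 0."""
    A, B, C, D = unpack(theta, n, m, p)
    x = np.zeros(n)
    y = np.zeros((u.shape[0], p))
    for t, ut in enumerate(u):
        y[t] = C @ x + D @ ut
        x = A @ x + B @ ut
    return y

def cost(theta, u, y_meas, n, m, p):
    """Output-error cost V(theta) = 0.5 * sum_t ||y_meas(t) - y(t, theta)||^2."""
    e = y_meas - simulate(theta, u, n, m, p)
    return 0.5 * np.sum(e ** 2)
```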

Citations: cited by 33 publications (30 citation statements)
References: 11 publications (12 reference statements)
“…However, optimisation methods are known to exhibit poor convergence properties when they are used to minimise an identification problem which relies on a pseudo-canonical parametrisation [17]. More specifically, we showed in a recent study that the pseudo-canonical LMFD can sometimes lead to a numerical locking of the convergence.…”
Section: New Model Parametrisation of Transfer Functions (mentioning; confidence: 91%)
“…As shown for state-space representation in [17], using a representation with a higher number of parameters improves the performance of gradient-based optimisation methods when this choice is combined with search dimension reductions at each iteration of the optimisation. As we will see in Sect.…”
Section: New Model Parametrisation of Transfer Functions (mentioning; confidence: 99%)
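The combination quoted above, a full parametrization together with a reduction of the search dimension at each iteration, can be sketched as follows: take the SVD of the residual Jacobian at the current iterate and compute the Gauss-Newton step only in the directions with non-negligible singular values, discarding the redundant directions along which the cost is (nearly) flat. This is an illustrative sketch under those assumptions, not the exact algorithm of [17] or of the citing paper; `jac` and `residual` are assumed to be supplied by the model code.

```python
import numpy as np

def reduced_gauss_newton_step(jac, residual, rtol=1e-8):
    """Gauss-Newton step restricted to directions that actually change the cost.

    jac      : (N, d) Jacobian of the residual w.r.t. all d parameters
    residual : (N,)   residual vector at the current parameter estimate
    Directions with (near-)zero singular values are dropped, so the update
    is computed in a lower-dimensional, well-conditioned subspace.
    """
    U, s, Vt = np.linalg.svd(jac, full_matrices=False)
    keep = s > rtol * s[0]                      # search-dimension reduction
    # step = -(J^T J)^{-1} J^T r, evaluated only in the retained directions
    return -Vt[keep].T @ ((U[:, keep].T @ residual) / s[keep])
```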
“…However, this is taken care of in most quasi-Newton solvers to ensure that the minimum is reached in a numerically stable way. The over-parametrization can also help the algorithm numerically, see McKelvey and Helmersson (1997).…”
Section: Lemma 1: For a Matrix A That Is Hurwitz It Holds That (mentioning; confidence: 99%)
“…This method determines the directions in the parameter space that do not change the cost function, and does not update the parameters in those directions by using an appropriate projection for the gradient in (5). The method is based on ideas used for identification of linear and linear parameter-varying state space systems by McKelvey and Helmersson (1997), Lee and Poolla (1999), and Verdult and Verhaegen (2000).…”
Section: Problem Description (mentioning; confidence: 99%)