Abstract: The recently proposed Dynamic Regressor Extension and Mixing (DREM) procedure has been proven to enhance transient performance in online parameter estimation, and it has been successfully applied to a variety of adaptive control problems and applications. However, to use this procedure, a linear operator has to be chosen to perform the dynamic extension. A poor choice of this operator can reduce the excitation of signals and hence compromise convergence properties. This paper presents a systematic selection o…
“…Remark 2. As pointed out in [10], applying Cramer's rule we have that Y_i = det{Φ_Yi}, where Φ_Yi is the matrix obtained by replacing the i-th column of Φ with Y.…”
Section: B. Generation Of m Scalar LREs Via DREM
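The mixing step in Remark 2 can be checked numerically. The sketch below (plain Python; the regressor matrix and parameter values are illustrative, not from the paper) forms Y_i = det{Φ_Yi} for a 2×2 extended regressor, so that Y_i = det(Φ)·θ_i and each parameter obeys its own scalar regression.

```python
# Cramer's-rule mixing step of DREM, sketched for a 2x2 extended
# regressor Phi and extended output Y (all values illustrative).

def det2(m):
    # Determinant of a 2x2 matrix given as [[a, b], [c, d]].
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def mix(Phi, Y):
    # For each i, replace the i-th column of Phi by Y and take the
    # determinant: Y_i = det(Phi_Yi).  Since Y = Phi * theta, this gives
    # Y_i = det(Phi) * theta_i -- one scalar regression per parameter.
    n = len(Phi)
    out = []
    for i in range(n):
        Phi_Yi = [[Y[r] if c == i else Phi[r][c] for c in range(n)]
                  for r in range(n)]
        out.append(det2(Phi_Yi))
    return out

theta = [2.0, -1.0]                       # true parameters (assumed)
Phi = [[1.0, 3.0], [2.0, 5.0]]            # extended regressor sample (assumed)
Y = [Phi[r][0] * theta[0] + Phi[r][1] * theta[1] for r in range(2)]
Delta = det2(Phi)                         # scalar regressor Delta = det(Phi)
Y_mixed = mix(Phi, Y)                     # Y_i = Delta * theta_i
print([y / Delta for y in Y_mixed])       # → [2.0, -1.0], recovering theta
```

Dividing each mixed output by Δ = det(Φ) recovers the corresponding parameter exactly in this noiseless illustration.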
“…Proposition 6. Consider the scalar CT LREs (7) and the gradient-descent estimator (10). Fix a constant ν_i ∈ (0, 1) and assume there exists a time t_c ∈ R_{>0} such that…”
Section: CT DREM Estimators With Finite-Time Convergence
“…We consider the simplest case of a scalar system y(t) = ∆(t)θ and simulate the gradient estimator (10), that is,…”
Section: A. Transient Performance Improvement Of Proposition
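The simulation described in that snippet is straightforward to reproduce: for the scalar system y(t) = Δ(t)θ, the gradient estimator obeys dθ̂/dt = γΔ(t)(y(t) − Δ(t)θ̂). The Euler-integration sketch below is illustrative; the regressor Δ(t), gain γ, and step size are assumptions, not the paper's settings.

```python
import math

# Euler simulation of the scalar gradient estimator
#   d(theta_hat)/dt = gamma * Delta(t) * (y(t) - Delta(t) * theta_hat)
# for y(t) = Delta(t) * theta.  Delta, gamma, dt are illustrative.

theta = 3.0                      # true parameter (assumed)
gamma = 5.0                      # adaptation gain (assumed)
dt, T = 1e-3, 10.0               # step size and horizon

theta_hat = 0.0
t = 0.0
while t < T:
    Delta = 1.0 + 0.5 * math.sin(t)   # non-vanishing scalar regressor
    y = Delta * theta
    theta_hat += dt * gamma * Delta * (y - Delta * theta_hat)
    t += dt

print(abs(theta - theta_hat))    # estimation error, shrinks toward zero
```

Since Δ(t)² ≥ 0.25 here, the error obeys ė = −γΔ²e and decays at least as fast as exp(−γt/4), so after T = 10 the residual error is negligible.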
We present some new results on the dynamic regressor extension and mixing parameter estimators for linear regression models recently proposed in the literature. This technique has proven instrumental in the solution of several open problems in system identification and adaptive control. The new results include: (i) a unified treatment of the continuous and the discrete-time cases; (ii) the proposal of two new extended regressor matrices, one which guarantees a quantifiable transient performance improvement, and the other exponential convergence under conditions that are strictly weaker than regressor persistence of excitation; and (iii) an alternative estimator ensuring convergence in finite-time whose adaptation gain, in contrast with the existing one, does not converge to zero. Simulations that illustrate our results are also presented.
“…For simplicity of the subsequent analysis and presentation, and without loss of generality, consider the case n = 2. Following [1][2][3], we introduce the delay operator:…”
Section: Theoretical Description Of The Problem
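The delay-operator construction referred to above can be sketched for a discrete-time regression y(k) = φ(k)ᵀθ with n = 2: stacking the current sample and one delayed copy yields a 2×2 extended system, and multiplying by the adjugate of the extended regressor decouples it into scalar regressions. The signals, delay d = 1, and sample index below are illustrative assumptions.

```python
import math

# Delay-operator DREM for a DT regression y(k) = phi(k)^T theta, n = 2.
# Extension: stack y(k) and y(k-1) (delay d = 1).  Mixing: multiply by
# the adjugate of the extended regressor.  All signals are illustrative.

theta = [1.5, -0.5]                      # true parameters (assumed)

def phi(k):
    # An exciting 2-vector regressor (assumed form).
    return [math.sin(0.3 * k), math.cos(0.7 * k)]

def y(k):
    p = phi(k)
    return p[0] * theta[0] + p[1] * theta[1]

k = 10                                   # sample index (illustrative)
Phi = [phi(k), phi(k - 1)]               # extended regressor (rows phi^T)
Ye = [y(k), y(k - 1)]                    # extended output

# Mixing via adj(Phi): since adj(Phi) * Phi = det(Phi) * I, each
# parameter satisfies the scalar regression  Delta * theta_i = Y_i.
Delta = Phi[0][0] * Phi[1][1] - Phi[0][1] * Phi[1][0]
Y1 = Phi[1][1] * Ye[0] - Phi[0][1] * Ye[1]
Y2 = -Phi[1][0] * Ye[0] + Phi[0][0] * Ye[1]
print(Y1 / Delta, Y2 / Delta)            # recovers theta componentwise
```

This is the "generation of scalar regressions" step whose perturbation, for interval-specified parameters, is the subject of the abstract below.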
“…The Dynamic Regressor Extension and Mixing (DREM) procedure [1][2][3] has become widely used for estimating the constant parameters of linear regression equations. Its application has enabled the solution of a number of previously open problems in control theory [4][5][6][7][8][9].…”
Subject of research. The applicability of the Dynamic Regressor Extension and Mixing (DREM) procedure to the identification of interval-specified parameters of a linear regression is studied. In contrast to known works, it is shown that applying the basic DREM procedure to the identification of interval-specified parameters generates, on some time intervals, perturbed scalar regressions, which substantially degrades the quality of the obtained estimates. Method. To resolve this problem, a new approach to dynamic regressor extension is proposed, based on interval integral filtering with exponential forgetting and resetting. Main results. A modified DREM procedure is proposed which, unlike the basic one, generates scalar regressions with an adjustable level of perturbation. Numerical experiments on the identification of interval-specified parameters confirm the obtained description of the perturbed scalar regressions and the presence of outliers in the parameter estimates of such regressions when the gradient and FCT-D (Finite Convergence Time DREM) loops are applied, and demonstrate that the magnitude of these estimation outliers can be regulated by the developed modified DREM procedure. Practical relevance. The procedure can be applied to the design of identification and adaptive control systems. Keywords: identification, linear regression, DREM, FCT-D, interval filtering, finite excitation. Acknowledgements: This work was supported by the Russian Foundation for Basic Research (project No. 18-47-310003 р_а).
Summary
The scope of this research is the problem of parameter identification for a linear time-invariant plant whose (1) input signal is not frequency-rich and which (2) is subject to nonzero initial conditions and external disturbances. The memory regressor extension (MRE) scheme, in which a specially derived differential equation is used as a filter, is applied to solve the above-stated problem. Such a filter allows one to obtain a bounded regressor for which the initial excitation (IE) condition is met. Using the MRE scheme, a recursive least-squares method with a forgetting factor is used to derive the adaptation law. The following properties have been proved for the proposed approach. If the IE condition is met, then: (1) the parameter identification error is bounded and converges exponentially to zero (in the absence of external disturbances) or to a set (in their presence) with an adjustable rate, and (2) the parameter adaptation rate is finite. These properties are mathematically proved and demonstrated via simulation experiments.
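The adaptation law described in that summary is a recursive least-squares update with forgetting applied to the filtered regression. The scalar-parameter sketch below uses an illustrative regressor and forgetting factor λ, not the paper's specific MRE filter, to show the structure of the update.

```python
import math

# Recursive least squares with forgetting factor lam for a scalar
# parameter theta in y(k) = phi(k) * theta.  The regressor, lam, and
# initial covariance are illustrative, not the paper's MRE filter.

theta = 2.5                      # true parameter (assumed)
lam = 0.95                       # forgetting factor
P = 100.0                        # initial covariance (large => fast start)
theta_hat = 0.0

for k in range(200):
    ph = 1.0 + 0.3 * math.sin(0.2 * k)   # bounded, non-vanishing regressor
    yk = ph * theta
    # Standard RLS-with-forgetting update (scalar form):
    K = P * ph / (lam + ph * P * ph)     # gain
    theta_hat += K * (yk - ph * theta_hat)
    P = (P - K * ph * P) / lam           # covariance update

print(abs(theta - theta_hat))    # identification error after 200 steps
```

The forgetting factor keeps P, and hence the adaptation gain, bounded away from zero, which is what makes the scheme responsive to parameter changes; in this noiseless run the error contracts by roughly λ per step after the initial transient.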