2015
DOI: 10.1007/s11071-015-2279-7

Design of modified fractional adaptive strategies for Hammerstein nonlinear control autoregressive systems

Abstract: In this study, modified fractional least mean square (FrLMS) algorithms are formulated for parameter estimation of the Hammerstein nonlinear control autoregressive (HNCAR) system by exploiting fractional calculus concepts in the weight adaptation mechanism of the algorithm. In the modified FrLMS (MFrLMS) of the first kind, a forgetting factor is applied to exploit the strengths of both the standard LMS and FrLMS algorithms. The MFrLMS algorithm of the second kind is based on a single fractional weight adaptation term in the cost function …
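The abstract is truncated before the update laws are stated, so the following is only an illustrative reading and not the paper's confirmed formulation: one way a forgetting factor $\lambda \in (0,1]$ can combine the standard LMS and FrLMS contributions is to weight the integer-order and fractional-order gradient terms in a single recursion,

$$\hat{w}_k(n) = \hat{w}_k(n-1) - \lambda\,\frac{\mu}{2}\,\frac{\partial J(n)}{\partial \hat{w}_k} - (1-\lambda)\,\frac{\mu_{fr}}{2}\,\frac{\partial^{fr} J(n)}{\partial \hat{w}_k^{fr}},$$

where $J(n)$ is the instantaneous cost, $\mu$ and $\mu_{fr}$ are step sizes, and $fr$ is the fractional order. With $\lambda = 1$ this reduces to the standard LMS update and with $\lambda \to 0$ to a purely fractional update, which is consistent with the abstract's claim that the first-kind MFrLMS exploits the strengths of both.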

Cited by 40 publications (13 citation statements) · References 47 publications
“…Minimizing the objective function () by taking the first-order derivative and the fractional derivative with respect to $\hat{w}_k$, 37,45 and combining this with the negative gradient search, the conventional fractional least mean square (FLMS) algorithm is expressed as:
$$\hat{w}_k(n) = \hat{w}_k(n-1) - \frac{\mu}{2}\,\frac{\partial J(n)}{\partial \hat{w}_k} - \frac{\mu_{fr}}{2}\,\frac{\partial^{fr} J(n)}{\partial \hat{w}_k^{fr}},$$
where $\mu$ and $\mu_{fr}$ are the step-size parameters of the filter corresponding to the first-order and fractional derivatives of the objective function, respectively, and $k = 0, 1, \dots, M-1$.…”
Section: Proposed Auxiliary Model Based Normalized Fractional Adaptiv... (mentioning)
confidence: 99%
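For readers who want to see the quoted recursion numerically, the sketch below implements one weight update under stated assumptions: the instantaneous cost is taken as $J(n) = e(n)^2/2$ with $e(n) = d(n) - \hat{w}^{T}x(n)$, and the fractional-derivative term uses the closed form $|w|^{1-fr}/\Gamma(2-fr)$ that is common in the FLMS literature. The function name `flms_update` and its parameters are illustrative choices, not identifiers from the paper.

```python
import numpy as np
from math import gamma

def flms_update(w, x, d, mu=0.01, mu_fr=0.01, fr=0.5):
    """One FLMS weight update (sketch, not the paper's exact code).

    Assumes the instantaneous cost J(n) = e(n)^2 / 2 with
    e(n) = d(n) - w^T x(n). The first correction is the ordinary
    LMS (integer-order) gradient step; the second is the
    fractional-order term, using the common closed form
    |w|^(1-fr) / Gamma(2-fr) for the fractional derivative of J.
    """
    e = d - np.dot(w, x)                 # instantaneous estimation error
    grad = -e * x                        # dJ/dw for J = e^2 / 2
    # Fractional term: |w| avoids non-integer powers of negative weights,
    # a common practical safeguard in FLMS implementations.
    frac_grad = -e * x * np.abs(w) ** (1.0 - fr) / gamma(2.0 - fr)
    return w - (mu / 2.0) * grad - (mu_fr / 2.0) * frac_grad

# Minimal usage: identify a 3-tap FIR system from noisy measurements.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])
w_hat = np.zeros(3)
for _ in range(5000):
    x = rng.standard_normal(3)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w_hat = flms_update(w_hat, x, d, mu=0.05, mu_fr=0.05, fr=0.5)
print(w_hat)  # should approach w_true
```

In the Hammerstein identification setting discussed here, $x(n)$ would be the regressor built from the nonlinear input basis and past outputs rather than a raw tap-delay line.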
“…Ortigueira and his team introduced the fundamental concepts and theory of F-SP [30,31]. These concepts have since been applied effectively to many problems, including system identification [44][45][46][47], speech enhancement [48], active noise control [35] and channel equalisation [36]. The aim of the present study is to go a step further in this domain by proposing a fractional variant of the VLMS algorithm for parameter estimation of the H-BJ system.…”
Section: Fractional Volterra LMS (mentioning)
confidence: 99%
“…To the best of our knowledge, various fractional-order gradient methods have been developed [34][35][36]. For example, in [37], a fractional-order SG algorithm was designed to identify Hammerstein nonlinear ARMAX systems using an improved fractional-order gradient method.…”
Section: Introduction (mentioning)
confidence: 99%