2020 Innovations in Intelligent Systems and Applications Conference (ASYU) 2020
DOI: 10.1109/asyu50717.2020.9259812
Extended Kalman Filter Based Modified Elman-Jordan Neural Network for Control and Identification of Nonlinear Systems

Cited by 9 publications (8 citation statements)
References 17 publications
“…Where I_i, FOPID, H_j, h_k, O_l, C_c, J_m, S_r, W_ji, V_kj, Z_lk, Q_jc, R_kc, E_kr, f_j, g_k, α, β, η, and st denote, respectively: layers 1–8; the weights between layers (2,3), (3,4), (4,5), (3,6), (3,7), and (4,8); the sigmoid activation functions; the feedback gains to the self-connections of layers 6, 7, and 8; and the step size [28]. A FOPPIDNNC4 is shown in Fig.…”
Section: Fractional Order Proportional Integral Derivative Neural Net…
confidence: 99%
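The quoted description is of a layered network with context feedback: hidden and output activations are fed back through dedicated context units that carry self-connection gains. A minimal sketch of that idea, assuming standard Elman (hidden-state) and Jordan (output-state) context conventions — the class name, sizes, and gain values here are illustrative, not the cited paper's exact FOPID structure:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ElmanJordanSketch:
    """Minimal Elman-Jordan hybrid: the hidden layer receives feedback
    from a hidden-state context (Elman) and an output context (Jordan),
    each context unit having a self-connection gain."""

    def __init__(self, n_in, n_hid, n_out, alpha=0.5, beta=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0, 0.1, (n_hid, n_in))   # input -> hidden
        self.V = rng.normal(0, 0.1, (n_out, n_hid))  # hidden -> output
        self.Q = rng.normal(0, 0.1, (n_hid, n_hid))  # Elman context -> hidden
        self.R = rng.normal(0, 0.1, (n_hid, n_out))  # Jordan context -> hidden
        self.alpha, self.beta = alpha, beta          # context self-connection gains
        self.c_h = np.zeros(n_hid)                   # Elman context state
        self.c_o = np.zeros(n_out)                   # Jordan context state

    def step(self, x):
        # Forward pass: hidden units see the input plus both context states.
        h = sigmoid(self.W @ x + self.Q @ self.c_h + self.R @ self.c_o)
        y = sigmoid(self.V @ h)
        # Context update: decayed copy of previous state plus new activation.
        self.c_h = self.alpha * self.c_h + h
        self.c_o = self.beta * self.c_o + y
        return y
```

Because the context states persist between calls, feeding the same input twice yields different outputs — the memory the context layers provide.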
“…Therefore, through the error-correction feedback of reverse transmission, the error is continuously kept within the adaptive threshold and satisfactory results can be obtained. The network cannot be trained from a sparse database, so the limitation is that a large amount of data must be collected and mastered to satisfy the operating conditions of BP neural network modeling [24], [6], [7].…”
Section: Back Propagation Neural Network
confidence: 99%
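The quoted passage describes BP training as driving the error below a threshold via backward error transmission. A self-contained sketch of that loop, assuming a plain 2-4-1 sigmoid network on XOR with an illustrative fixed threshold (the sizes, learning rate, and threshold are assumptions, not taken from the cited work):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1.0, (2, 4))   # input -> hidden weights
W2 = rng.normal(0, 1.0, (4, 1))   # hidden -> output weights
eta, threshold = 0.5, 0.05        # learning rate, stopping threshold

# Initial error, for comparison after training.
mse0 = float(np.mean((T - sigmoid(sigmoid(X @ W1) @ W2)) ** 2))

for epoch in range(20000):
    H = sigmoid(X @ W1)                  # forward pass
    Y = sigmoid(H @ W2)
    E = T - Y
    mse = float(np.mean(E ** 2))
    if mse < threshold:                  # error held within the threshold
        break
    dY = E * Y * (1 - Y)                 # backward pass (reverse transmission)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += eta * H.T @ dY                 # gradient-descent weight corrections
    W1 += eta * X.T @ dH
```

The loop also illustrates the quoted limitation: with only four training patterns the fit is fragile, which is why BP modeling in practice demands far larger datasets.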
“…where Λ_c is the constant with respect to Λ. Let γ λ, and utilizing (24) in (22), this paper obtains…”
Section: The PGAF With Variable Step Size
confidence: 99%