2023
DOI: 10.3390/fractalfract7060474
Dynamical Analysis of the Incommensurate Fractional-Order Hopfield Neural Network System and Its Digital Circuit Realization

Abstract: Dynamical analysis of incommensurate fractional-order neural networks is a novel topic in the field of chaos research. This article investigates a Hopfield neural network (HNN) system with incommensurate fractional orders. Using the Adomian decomposition method (ADM), the solution of the incommensurate fractional-order Hopfield neural network (FOHNN) system is obtained. The equilibrium point of the system is discussed, and the dissipative characteristics are verified. By varying …
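For orientation, an incommensurate FOHNN assigns a distinct fractional order q_i to each state variable. Assuming the standard HNN form D^{q_i} x_i = -x_i + Σ_j w_ij tanh(x_j), the sketch below integrates such a system numerically. It uses an explicit Grünwald–Letnikov scheme as a simple stand-in, not the paper's ADM algorithm, and the 3-neuron weight matrix, orders, initial state, and step size are illustrative assumptions rather than the paper's values.

```python
import numpy as np

def gl_weights(q, n):
    """Grunwald-Letnikov binomial weights c_j = (-1)^j * C(q, j), built recursively."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = (1.0 - (1.0 + q) / j) * c[j - 1]
    return c

def simulate_fohnn(W, q, x0, h=0.01, steps=5000):
    """Explicit GL scheme for D^{q_i} x_i = -x_i + sum_j W[i, j] * tanh(x_j),
    with a (possibly different) fractional order q[i] for each state variable."""
    dim = len(x0)
    c = np.array([gl_weights(qi, steps) for qi in q])   # per-order memory weights
    x = np.zeros((steps + 1, dim))
    x[0] = x0
    hq = h ** np.asarray(q)                             # h^{q_i}, elementwise
    for n in range(1, steps + 1):
        f = -x[n - 1] + W @ np.tanh(x[n - 1])           # HNN right-hand side
        # memory term: sum over j = 1..n of c_j^{(q_i)} * x_i(t_{n-j})
        mem = np.einsum('ij,ji->i', c[:, 1:n + 1], x[n - 1::-1])
        x[n] = hq * f - mem
    return x

# Hypothetical 3-neuron example; weights, orders, and x0 are illustrative only.
W = np.array([[ 2.0, -1.2,  0.0],
              [ 1.8,  1.7,  1.1],
              [-4.7,  0.0,  1.0]])
q = [0.95, 0.90, 0.85]                                  # incommensurate orders
traj = simulate_fohnn(W, q, x0=[0.1, 0.1, 0.1])
```

The full history sum is what makes fractional-order simulation costly: unlike an integer-order ODE step, every new point depends on all previous states through the q_i-dependent weights.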

Cited by 7 publications (6 citation statements)
References 50 publications
“…Among the basic parameters, α represents the weight of the feedback signal received by the current neuron from other neurons, and is essentially used to balance the strengths of the chaotic dynamics and the gradient descent [1][2][3][4]. z(0) and β mainly determine the initial value and the decay rate of the self-feedback (chaos) term, respectively [1][2][3][4][5][6][7][8]. For the adjustment mechanism of α, z(0), and β, it is usually expected that the early stage is dominated by chaotic search (improving the global search ability and avoiding local optima) and the late stage is dominated by gradient convergence (improving the convergence speed).…”
Section: MFCSCNN Model
confidence: 99%
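The three parameters in this statement map directly onto the standard transiently chaotic neural network (TCNN, Chen–Aihara) neuron update. A minimal single-neuron sketch of that update follows, assuming the usual TCNN form; the damping factor k, steepness ε, and bias I0 are assumed values, not taken from the statement or the paper.

```python
import math

def tcnn_neuron_step(y, z, grad, alpha, beta, k=0.9, eps=1.0 / 250, I0=0.65):
    """One update of a transiently chaotic neuron (Chen-Aihara form).
    alpha weights the external feedback/gradient signal; z is the
    self-feedback (chaos) term, decaying geometrically at rate beta."""
    x = 1.0 / (1.0 + math.exp(-y / eps))          # neuron output
    y_next = k * y + alpha * grad - z * (x - I0)  # damping + input - chaotic self-feedback
    z_next = (1.0 - beta) * z                     # z(t+1) = (1 - beta) * z(t)
    return y_next, z_next

# Early on, large z drives chaotic search; as z decays, the alpha-weighted
# gradient term takes over and the neuron settles (convergence phase).
y, z = 0.0, 0.1                                   # z(0) sets the initial chaos strength
for t in range(1000):
    grad = -0.5                                   # placeholder input / negative gradient
    y, z = tcnn_neuron_step(y, z, grad, alpha=0.015, beta=0.005)
```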
“…Appropriate parameter selection determines whether the algorithm can find the optimal solution quickly and accurately, which has always been the difficulty faced by TCNN-class optimization models. Therefore, to enable rapid, effective, and clear research with appropriate parameter settings, this paper summarizes all parameter settings and selection guidance, drawing on the literature [1][2][3][4][5][6][7][8][9][10][17][18][19] and the experimental analysis and verification above, as shown in Table 2. To obtain good optimization performance, it is necessary to properly select and balance the basic parameters of the model against the MFCS parameter settings. The higher the complexity of the optimization problem, the stronger the non-monotonicity required of the MFCS.…”
Section: MFCSCNN Model
confidence: 99%
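Because the self-feedback term decays geometrically, z(t) = z(0)(1 − β)^t, the length of the chaos-dominated phase can be estimated directly from z(0) and β. A small worked check, with z(0), β, and the switch threshold chosen purely for illustration:

```python
import math

z0, beta = 0.1, 0.01   # illustrative values, not settings from the paper
z_th = 1e-3            # threshold below which gradient convergence dominates

# z(t) = z0 * (1 - beta)^t  =>  t = ln(z_th / z0) / ln(1 - beta)
t_switch = math.log(z_th / z0) / math.log(1.0 - beta)
print(f"chaos-dominated phase lasts about {t_switch:.0f} iterations")  # ~458
```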