2012
DOI: 10.1007/s11063-012-9241-1
Accelerating a Recurrent Neural Network to Finite-Time Convergence for Solving Time-Varying Sylvester Equation by Using a Sign-Bi-power Activation Function

Cited by 315 publications (73 citation statements)
References 28 publications
“…These problems were observed and further studied by Dieci and Eirola in 1999 [5] and, more recently, by Sirković and Kressner [15] in 2016; they likewise occur for a related parameter-varying matrix eigenproblem of Loisel and Maxwell in 2018 [10]. Our ZNN eigenvalue algorithm is impervious to these restrictions, as it handles repeated eigenvalues in symmetric matrix flows A(t) without any problems; see Figures 1, 5, and 8. The new ZD model of this paper is a discrete dynamical system with the potential for practical real-time and on-chip implementation. We include results of computer simulations and numerical experiments that illustrate the usefulness and efficiency of our real-time discrete ZD matrix eigenvalue algorithm, both for smooth data inputs and for piecewise smooth time-varying matrix flows A(t) that might occur naturally when sensors fail or data lines get disrupted in the field.…”
Section: Introduction
confidence: 78%
“…ZD methods are specifically designed for time-varying problems and have proven most efficient there. They use first-order time derivatives and have been applied successfully, for example, to solving time-varying Sylvester equations [22,9,8], finding time-varying matrix inverses [21,7,3,12] (see also [4]), and optimizing time-varying matrix minimization problems [26,11], all in real time. These algorithms generally use both one-step-ahead and backward differentiation formulas and run in discrete time with high accuracy.…”
Section: Introduction
confidence: 99%
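The ZD/ZNN design recipe these excerpts refer to can be sketched for a time-varying Sylvester equation A(t)X − XB(t) = C(t): define the error E(t) = A(t)X − XB(t) − C(t), impose the design formula Ė = −γΦ(E), and integrate the resulting ODE for X(t). The coefficient matrices, γ, the step sizes, and the simple linear activation Φ(E) = E below are illustrative assumptions, not the models of any one cited paper; a minimal sketch:

```python
import numpy as np

# Illustrative time-varying coefficients (assumed for this sketch).
def A(t): return np.array([[2 + np.sin(t), 0.0], [0.0, 2 + np.cos(t)]])
def B(t): return np.array([[-1.0, 0.2], [0.0, -1.0]])   # constant here
def C(t): return np.array([[np.sin(t), 1.0], [0.0, np.cos(t)]])

def ddt(F, t, h=1e-6):
    """Forward-difference time derivative of a matrix function."""
    return (F(t + h) - F(t)) / h

gamma, dt, T = 10.0, 1e-3, 5.0
n = 2
X = np.zeros((n, n))          # arbitrary initial state
I = np.eye(n)
t = 0.0
while t < T:
    E = A(t) @ X - X @ B(t) - C(t)                  # residual error
    # Design formula Edot = -gamma*E rearranges to A Xdot - Xdot B = rhs:
    rhs = -ddt(A, t) @ X + X @ ddt(B, t) + ddt(C, t) - gamma * E
    # Solve the linear Sylvester system via Kronecker vectorization:
    M = np.kron(I, A(t)) - np.kron(B(t).T, I)
    Xdot = np.linalg.solve(M, rhs.flatten(order="F")).reshape(n, n, order="F")
    X = X + dt * Xdot                               # explicit Euler step
    t += dt

residual = np.linalg.norm(A(t) @ X - X @ B(t) - C(t))
```

With the linear activation the error obeys Ė = −γE exactly, so the residual decays exponentially at rate γ and the remaining error is only the Euler discretization lag; the sign-bi-power activation discussed below replaces Φ to make this convergence finite-time.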
“…In this paper, we will combine an activation function presented in 2013 [8], referred to as the Li activation function, with the GNN model (2). Both theoretical analysis and numerical simulation show that when using GNN model (2) …”
Section: Gradient-Based Neural Network (GNN) Model
confidence: 99%
“…In this paper, the improved GNN model is presented with a new activation function suggested in [8], referred to as the Li activation function, for solving the Lyapunov matrix equation. Global convergence and finite-time convergence are proven theoretically.…”
Section: Introduction
confidence: 99%
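The Li (sign-bi-power) activation cited in these excerpts is commonly written, for a design parameter 0 < r < 1, as φ(e) = ½(|e|^r + |e|^{1/r})·sgn(e), applied elementwise to the error; it is this activation that upgrades the exponential convergence of a linearly activated ZNN/GNN to finite-time convergence. A minimal scalar sketch of the error dynamics ė = −γφ(e) (the values of γ, r, the step size, and the tolerance are illustrative assumptions):

```python
import numpy as np

def sign_bi_power(e, r=0.5):
    """Sign-bi-power (Li) activation: 0.5*(|e|^r + |e|^(1/r))*sgn(e), 0 < r < 1."""
    e = np.asarray(e, dtype=float)
    return 0.5 * (np.abs(e) ** r + np.abs(e) ** (1.0 / r)) * np.sign(e)

# Scalar error dynamics edot = -gamma * phi(e), integrated with explicit Euler.
gamma, dt = 1.0, 1e-4
e, t = 1.0, 0.0
while abs(e) > 1e-8 and t < 10.0:
    e = e - dt * gamma * float(sign_bi_power(e))
    t += dt
settling_time = t   # the error hits the tolerance in finite time, well before t = 10
```

For |e| < 1 the |e|^r term dominates and drives e to zero in finite time (an exponential decay −γe would only approach zero asymptotically), while for |e| > 1 the |e|^{1/r} term accelerates large errors; this two-regime behavior is the point of the "bi-power" construction.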
“…They are widely applied in scientific and engineering fields, for example optimization (Smith (1999); Li, Lou and Liu (2012)), control of chaos (Lin, Li and Liu (2012)), pattern classification (Burrows and Niranjan (1994); Husken and Stagge (2003)), signal processing (Skowronski and Harris (2007)), robotics (Li, Chen, Liu, Li and Liang (2012)), solving the time-varying Sylvester equation (Li, Chen and Liu (2013)), the winners-take-all competition (Liu and Wang (2008)), convex quadratic programming (Zhang and Wang (2002); Xia and Sun (2009); Xia, Feng and Wang (2004)), kinematic control of redundant manipulators (Zhang, Wang and Xia (2003)), etc. In particular, Hopfield neural networks (Hopfield (1984)), a class of recurrent neural networks, can be used for online optimization.…”
Section: Introduction
confidence: 99%