2012
DOI: 10.1049/el.2012.1940
Robustness analysis of Wang neural network for online linear equation solving

Cited by 26 publications (12 citation statements)
References 4 publications
“…In general, the minimal number of arithmetic operations required by numerical algorithms is proportional to the cube of the dimension of the coefficient matrix, i.e., O(n³) [7]. To satisfy low-complexity and real-time requirements, numerous novel neural networks have recently been developed for hardware implementation [2,4,5,[8][9][10][11][12][13]. For example, Tank and Hopfield solved linear programming problems using their proposed Hopfield neural network (HNN) [9], which promoted the development of neural networks for optimization and other application problems.…”
Section: Introduction
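The O(n³) cost mentioned above can be made concrete with the standard flop count for LU-based Gaussian elimination, roughly (2/3)n³ operations; a minimal illustration (the function name is ours, not from the paper):

```python
# Illustration (not from the cited work): direct solvers such as
# LU-based Gaussian elimination need roughly (2/3) * n^3 floating-point
# operations, so the cost grows cubically with the matrix dimension n.
def ge_flops(n):
    """Approximate flop count for LU-based Gaussian elimination."""
    return 2 * n**3 / 3

# Doubling the dimension multiplies the work by about 8, which is what
# motivates the low-complexity, hardware-friendly neural approaches.
ratio = ge_flops(200) / ge_flops(100)
```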
“…In this paper, based on the Wang neural network [10], we present an improved gradient-based neural model for linear simultaneous equations, and this neural model is then applied to solve quadratic programming with equality constraints. Much investigation and analysis of the Wang neural network has been presented in previous work [10,12,13]. To make full use of the Wang neural network, we transform the convex quadratic programming into a general linear matrix equation.…”
Section: Introduction
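The transformation described above, turning an equality-constrained convex QP into one linear matrix equation, is conventionally done through the KKT optimality conditions; a minimal numpy sketch under that standard construction (the specific matrices Q, c, A, b are illustrative, not taken from the paper):

```python
import numpy as np

# Equality-constrained convex QP:  minimize 0.5*x^T Q x + c^T x  s.t.  A x = b.
# Its KKT optimality conditions form a single linear system in (x, lambda):
#   [ Q  A^T ] [ x   ]   [ -c ]
#   [ A   0  ] [ lam ] = [  b ]
Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])               # one equality constraint
b = np.array([1.0])

n, m = Q.shape[0], A.shape[0]
kkt = np.block([[Q, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([-c, b])
sol = np.linalg.solve(kkt, rhs)          # any linear-equation solver applies
x, lam = sol[:n], sol[n:]
```

Any method for linear equations, including a Wang-type neural network, can then be applied to the assembled KKT system.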
“…where x(t) ∈ Rⁿ is the neural state vector and γ is the design parameter that scales the convergence rate. According to [3,4], WNN (1) has global (convergence) stability and robustness against disturbance noises. In this Letter, we propose the following EWNN for solving Ax = b:…”
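The Letter does not reproduce the EWNN itself here, but the gradient-type WNN dynamics it builds on, ẋ(t) = −γAᵀ(Ax(t) − b), can be sketched with a forward-Euler simulation (the matrices and the step size are our illustrative choices):

```python
import numpy as np

# Hedged sketch: forward-Euler simulation of the gradient-type WNN
# dynamics  x'(t) = -gamma * A^T (A x(t) - b)  for solving A x = b.
# (The exact EWNN form proposed in the Letter is not reproduced here.)
A = np.array([[2.0, 1.0], [1.0, 3.0]])   # nonsingular example matrix
b = np.array([3.0, 5.0])
gamma, dt = 10.0, 1e-3                   # design parameter, Euler step size

x = np.zeros(2)                          # arbitrary initial state
for _ in range(20000):
    x = x + dt * (-gamma * A.T @ (A @ x - b))

# The state x(t) should approach the theoretical solution x* = A^{-1} b;
# larger gamma accelerates convergence (subject to the Euler step limit).
```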
“…From the derivation above, we can see that v̇(t) ≤ 0, i.e., it is negative definite. According to the well-known Lyapunov theory [4], the error Ax(t) − b globally converges to zero. This implies that the neural state x(t) globally converges to the theoretical solution x* of the linear equations, since A is nonsingular.…”