2003
DOI: 10.1103/physreve.67.061902
Stability analysis of a delayed Hopfield neural network

Abstract: In this paper, we study a class of neural networks, which includes bidirectional associative memory networks and cellular neural networks as its special cases. By Brouwer's fixed point theorem, a continuation theorem based on Gaines and Mawhin's coincidence degree, matrix theory, and inequality analysis, we not only obtain some different sufficient conditions ensuring the existence, uniqueness, and global exponential stability of the equilibrium but also estimate the exponential convergence rate. Our results a…
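The delayed Hopfield-type networks in this class have the familiar form dx_i/dt = −c_i x_i(t) + Σ_j a_ij f_j(x_j(t)) + Σ_j b_ij f_j(x_j(t − τ)) + I_i. As a rough illustration of the global exponential convergence the abstract describes, the sketch below simulates a two-neuron instance with a single discrete delay; every parameter value is invented for illustration and is not taken from the paper.

```python
import numpy as np

# Hypothetical 2-neuron delayed Hopfield network (parameters invented
# for illustration):
#   dx_i/dt = -c_i x_i(t) + sum_j a_ij f(x_j(t)) + sum_j b_ij f(x_j(t - tau)) + I_i
c = np.array([2.0, 2.0])                  # self-decay rates
A = np.array([[0.3, -0.2], [0.1, 0.2]])   # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.3]])   # delayed connection weights
I = np.array([0.5, -0.4])                 # constant external inputs
tau, dt = 1.0, 0.01                       # discrete delay, Euler step size
f = np.tanh                               # bounded, 1-Lipschitz activation

d = int(tau / dt)                         # delay expressed in steps
steps = int(20.0 / dt)
x = np.zeros((steps + d, 2))
x[:d + 1] = 0.8                           # constant initial history on [-tau, 0]

# Forward-Euler integration of the delay differential equation
for t in range(d, steps + d - 1):
    dx = -c * x[t] + A @ f(x[t]) + B @ f(x[t - d]) + I
    x[t + 1] = x[t] + dt * dx
```

Because the decay rates c_i dominate the total connection gain in this toy example, the trajectory settles exponentially onto a unique equilibrium regardless of the chosen initial history, which is the qualitative behavior the paper's conditions guarantee.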

Cited by 48 publications (27 citation statements)
References 18 publications (2 reference statements)
“…In Theorem 4.1 and Corollary 4.1, we do not need the assumptions of boundedness, monotonicity, and differentiability for the activation functions; moreover, the model discussed is with continuously distributed delays. Clearly, the proposed results are different from those in [1–17] and the references cited therein. Therefore, the results of this paper are new and they complement previously known results.…”
Section: Then (contrasting)
confidence: 62%
“…Many good results have already been obtained by some authors in [1–17] and the references cited therein. Moreover, in the above-mentioned literature the existing results are based on the assumption that either the activation functions or the delays are bounded.…”
Section: Introduction (mentioning)
confidence: 64%
“…Some other models, such as continuous BAM (bidirectional associative memory) networks, cellular neural networks, and Hopfield-type neural networks, are special cases of the network model (1.1) (see, for instance, [1–4]). Obviously, it is also a generalization of the following CGNNs model, where c_ij are constants and a_i(·), b_i(·), x_i(t), f_j(·) are the same as those in (1.1), i, j = 1, 2, …”
Section: Introduction (mentioning)
confidence: 99%
“…The first two conditions can guarantee a network to be convergent with a prescribed exponential decay rate and trajectory bounds, described respectively by σ and α, β, while the last two only ensure exponential convergence in a network, saying nothing explicitly about the decay rate (condition (18) also provides an estimate of the trajectory bound). On the other hand, it should be noted that conditions (18) and (19) are delay-independent. This is of practical significance in the case where time delays exist but their magnitudes cannot be evaluated accurately.…”
Section: η) ≥ ĩ(T) (mentioning)
confidence: 99%
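Delay-independent conditions of the kind this citation describes are typically matrix conditions of M-matrix or diagonal-dominance type. As a hedged illustration (a standard sufficient test from this literature, not necessarily the exact conditions (18) and (19) of the cited work), the helper below checks strict row diagonal dominance of diag(c) − (|A| + |B|)·diag(L), where L collects the Lipschitz constants of the activation functions:

```python
import numpy as np

def delay_independent_stable(c, A, B, L):
    """Sufficient, delay-independent stability test (illustrative):
    build M = diag(c) - (|A| + |B|) @ diag(L) and require a positive,
    strictly row-dominant diagonal (a simple nonsingular-M-matrix test)."""
    M = np.diag(c) - (np.abs(A) + np.abs(B)) @ np.diag(L)
    diag = np.diag(M)
    off = np.sum(np.abs(M), axis=1) - np.abs(diag)
    return bool(np.all(diag > 0) and np.all(diag > off))

# Hypothetical 2-neuron network: decay rates dominate total connection gain
c = np.array([2.0, 2.0])
A = np.array([[0.3, -0.2], [0.1, 0.2]])   # instantaneous weights
B = np.array([[0.2, 0.1], [-0.1, 0.3]])   # delayed weights
L = np.ones(2)                            # Lipschitz constants of tanh-like activations
print(delay_independent_stable(c, A, B, L))        # True: decay dominates
print(delay_independent_stable(0.1 * c, A, B, L))  # False: decay too weak
```

Note that the test depends only on the magnitudes of the weights and the Lipschitz constants, never on the delay τ, which is exactly what makes such conditions useful when delay magnitudes cannot be evaluated accurately.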
“…In this paper, we do not require the activation functions to be bounded, differentiable, and globally Lipschitz continuous; also, we do not assume that the considered model has any equilibrium. In particular, we give conditions for the global exponential stability and the existence of the periodic solution of delayed neural networks by the method used in [10–13,18]. In addition, one example is given to illustrate the results.…”
Section: Introduction (mentioning)
confidence: 99%