2006
DOI: 10.1016/j.camwa.2005.10.009
Exponential periodicity and stability of neural networks with reaction-diffusion terms and both variable and unbounded delays

Abstract: In this paper, the exponential periodicity and stability of neural networks with Lipschitz continuous activation functions are investigated, without assuming the boundedness of the activation functions or the differentiability of the time-varying delays, as required in most other papers. The neural networks contain reaction-diffusion terms and both variable and unbounded delays. Some sufficient conditions ensuring the existence and uniqueness of periodic solution and stability of neural networks with reaction-diffu…
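The class of models the abstract describes (Hopfield-type networks with reaction-diffusion terms plus instantaneous and delayed couplings) can be illustrated with a small numerical sketch. This is a minimal assumption-laden example, not the paper's method: all parameter values, the 1-D spatial domain, the constant delay, the `tanh` activation, and the explicit Euler / finite-difference discretization are illustrative choices.

```python
import numpy as np

# Hedged sketch: a 1-D finite-difference simulation of a two-neuron
# delayed reaction-diffusion network of the general form
#   du_i/dt = D_i * Laplacian(u_i) - a_i*u_i
#             + sum_j b_ij f(u_j(t)) + sum_j c_ij f(u_j(t - tau)) + J_i.
# Every numeric value below is an illustrative assumption.
def simulate(T=2.0, dt=1e-3, nx=21, L=1.0):
    dx = L / (nx - 1)
    D = np.array([0.1, 0.1])                   # diffusion coefficients
    a = np.array([2.0, 2.0])                   # decay rates
    b = np.array([[0.5, -0.2], [0.3, 0.4]])    # instantaneous weights
    c = np.array([[0.2, 0.1], [-0.1, 0.2]])    # delayed weights
    J = np.array([0.1, -0.1])                  # external inputs
    f = np.tanh                                # Lipschitz activation
    tau = 0.1                                  # delay (constant here)
    nsteps = int(T / dt)
    dsteps = int(tau / dt)
    # History buffer holding the state over [t - tau, t]:
    # shape (dsteps + 1, neurons, space); zero initial history.
    u = np.zeros((dsteps + 1, 2, nx))
    for _ in range(nsteps):
        cur, delayed = u[-1], u[0]
        # Second-order central difference for the Laplacian (interior).
        lap = np.zeros_like(cur)
        lap[:, 1:-1] = (cur[:, 2:] - 2 * cur[:, 1:-1] + cur[:, :-2]) / dx**2
        rhs = (D[:, None] * lap - a[:, None] * cur
               + b @ f(cur) + c @ f(delayed) + J[:, None])
        new = cur + dt * rhs
        new[:, 0] = new[:, -1] = 0.0   # Dirichlet boundary conditions
        u = np.roll(u, -1, axis=0)     # advance the delay buffer
        u[-1] = new
    return u[-1]

state = simulate()
print(state.shape)  # (2, 21)
```

The explicit scheme is stable here because `dt * D / dx**2` is well below 1/2; a production code would use an implicit scheme and the actual boundary conditions of the model under study.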

Cited by 35 publications (19 citation statements)
References 34 publications (46 reference statements)
“…It is not difficult to discover that Theorem 1 in [29] is involved in our Corollary 2. However, since there exist reaction-diffusion terms, Corollary 4 in this paper is not as good as Theorem 1 in [27].…”
Section: Remarks and Examples (mentioning)
confidence: 99%
“…A more appropriate and ideal way is to incorporate both finite delays and infinite delays, e.g., Refs. [27]–[29]. However, strictly speaking, diffusion effects cannot be avoided in neural networks when electrons are moving in asymmetric electromagnetic fields.…”
Section: Introduction (mentioning)
confidence: 99%
“…Hence, it is essential to consider the state variables varying with both time and space. Neural networks with diffusion terms can commonly be expressed by partial differential equations; [10][11][12][13][14][15] have considered the stability of neural networks with diffusion terms, in which the boundary conditions are all Neumann boundary conditions. A neural network model with Dirichlet boundary conditions has been considered in [16,21], but it concentrated on deterministic systems and did not take random perturbations into consideration.…”
Section: Introduction (mentioning)
confidence: 99%