2008
DOI: 10.1016/j.nonrwa.2007.03.018
Global exponential stability in Lagrange sense for recurrent neural networks with time delays

Cited by 93 publications (52 citation statements)
References 34 publications (54 reference statements)
“…Definition 1 (see [57]). The neural network (2) is said to be uniformly stable in Lagrange sense if, for any ε > 0, there is a positive constant ℓ = ℓ(ε) > 0 such that ‖x(t, φ)‖ < ℓ for any…”
Section: Assumption (A)
confidence: 99%
“…Definition 2 (see [57]). If there exist a radially unbounded and positive definite function V(·), a nonnegative continuous function W(·), and two positive constants ℓ and α such that, for any solution x(t) of neural network (2), V(x(t)) > ℓ implies V(x(t)) − ℓ ≤ W(φ) e^{−αt} for any t ≥ 0 and φ ∈ C, then the neural network (2) is said to be globally exponentially attractive (GEA) with respect to V(x(t)), and the compact set Ω = {x ∈ Rⁿ : V(x) ≤ ℓ} is said to be a GEA set of (2).…”
Section: Assumption (A)
confidence: 99%
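The two definitions above can be illustrated numerically. Below is a minimal sketch for a scalar delayed network x′(t) = −a·x(t) + b·tanh(x(t − τ)) + I; the system, its parameters, and the Euler scheme are illustrative assumptions for this report, not the model analyzed in the cited paper. Since |tanh| ≤ 1 and b < a, every solution eventually enters the ball |x| ≤ (|b| + |I|)/a, a Lagrange-stable (bounded) regime regardless of the initial state:

```python
import math

def simulate(x0, a=1.0, b=0.5, I=1.0, tau=1.0, dt=0.01, T=30.0):
    """Euler integration of the illustrative delayed scalar network
       x'(t) = -a*x(t) + b*tanh(x(t - tau)) + I,
    with constant history x(s) = x0 for s <= 0."""
    d = int(round(tau / dt))           # delay expressed in integration steps
    n = int(round(T / dt))
    hist = [x0] * (d + 1)              # hist[-1] = x(t), hist[0] = x(t - tau)
    traj = [x0]
    for _ in range(n):
        x, x_delayed = hist[-1], hist[0]
        x_new = x + dt * (-a * x + b * math.tanh(x_delayed) + I)
        hist = hist[1:] + [x_new]      # slide the delay window forward
        traj.append(x_new)
    return traj

# Widely separated initial states all end up inside the attractive set
# |x| <= (|b| + |I|) / a = 1.5, independent of x0 — Lagrange stability
# of the whole system rather than of a single equilibrium.
finals = [simulate(x0)[-1] for x0 in (-5.0, 0.0, 5.0)]
print(finals)
```

With V(x) = |x| this mirrors Definition 2: once trajectories reach the compact set Ω = {x : |x| ≤ 1.5} they stay there, and the approach to Ω is exponential because the linear decay rate a dominates the delayed feedback gain b.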
“…It is worth mentioning that, unlike Lyapunov stability, Lagrange stability refers to the stability of the total system rather than the stability of individual equilibria, because Lagrange stability is based on the boundedness of solutions, which in turn depends on the existence of globally attractive sets (see [12,18,19,[21][22][23][24][25]). We also note that Lagrange stability has attracted considerable worldwide attention.…”
Section: Introduction
confidence: 99%
“…Therefore, the study of Lagrange stability of neural networks with delays is of practical importance, and it has been extensively studied. For example, in [18,19], Liao et al. apply Lyapunov functions to study Lagrange stability for recurrent neural networks with constant and time-varying delays. In [20], Yang and Cao consider stability in the Lagrange sense for a class of feedback neural networks for optimization problems.…”
Section: Introduction
confidence: 99%