2015
DOI: 10.1007/s11063-015-9486-6
Exponential stability of a class of competitive neural networks with multi-proportional delays

Cited by 24 publications (7 citation statements) · References 29 publications
“…Obviously, conditions (A1) and (A3) imply conditions (H1) and (H2), respectively, and Theorem II.1 does not require condition (A2). So our Theorem II.1 greatly improves Theorem III.1 in [30].…”
Section: The Existence and Uniqueness of Equilibrium Point
confidence: 68%
“…Therefore, studying the various dynamical behaviours of CNNs with proportional delays is of great theoretical and practical significance. Until now, however, there has been only one published work dealing with the exponential stability of competitive neural networks with proportional delays. In the cited paper, the authors discussed the exponential stability of CNNs with multi-proportional delays of the form

$$
\begin{cases}
\text{STM:}\quad \varepsilon\,\dfrac{\mathrm{d}x_i(t)}{\mathrm{d}t} = -a_i x_i(t) + \displaystyle\sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} c_{ij} f_j\bigl(x_j(q_j t)\bigr) + B_i \sum_{j=1}^{n} d_j m_{ij}(t) + I_i,\\[2mm]
\text{LTM:}\quad \dfrac{\mathrm{d}m_{ij}(t)}{\mathrm{d}t} = -m_{ij}(t) + d_j f_i\bigl(x_i(t)\bigr),
\end{cases}
$$

for $t \ge 1$, $i, j = 1, 2, \ldots, n$, where $x_i(t)$ denotes the current activity level of neuron $i$; $m_{ij}$ is the synaptic efficiency; $a_i > 0$ is the changing rate for neuron $i$; $b_{ij}$ and $c_{ij}$ are constants denoting the strength of connectivity between cells $j$ and $i$ at time $t$ and the connection weights at time $q_j t$, respectively; $d_j$ is an arbitrarily given constant; $q_j$ is the proportional delay factor satisfying $0 < q_j \le 1$ and $q_j t = t - (1 - q_j)t$, in which $(1 - q_j)t$ corresponds to the time-delay function, and $(1 - q_j)t \to \infty$ as $t \to \infty$ whenever $q_j \ne 1$; $q = \min_{1 \le j \le n} q_j$; $I_i$ denotes the external input; $B_i > 0$ is the external stimulus intensity; $\varepsilon > 0$ is a fast time scale determined by the STM; $f_i(\cdot)$ is the nonlinear activation function.…”
Section: Introduction
confidence: 99%
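Read as a dynamical system, the STM/LTM model quoted above can be simulated directly. The sketch below is a minimal forward-Euler integration in Python; every parameter value (n, a, b, c, B, d, I, q, eps) is an illustrative assumption rather than a value from the paper, the activation is taken to be tanh, and the proportional delay term x_j(q_j t) is evaluated by interpolating the stored trajectory, clamped to the initial state when q_j t < 1 (a simplifying choice for the initial history).

```python
# Minimal forward-Euler sketch of the quoted STM/LTM system.
# All parameter values are illustrative assumptions, not taken from the paper.
import numpy as np

n = 3                                   # number of neurons (assumed)
rng = np.random.default_rng(0)
a = np.array([2.0, 2.5, 3.0])           # changing rates a_i > 0
b = 0.2 * rng.standard_normal((n, n))   # connection strengths b_ij at time t
c = 0.1 * rng.standard_normal((n, n))   # connection weights c_ij at time q_j t
B = np.array([1.0, 1.0, 1.0])           # external stimulus intensities B_i > 0
d = np.array([0.5, -0.3, 0.4])          # given constants d_j
I = np.array([0.1, -0.2, 0.05])         # external inputs I_i
q = np.array([0.5, 0.7, 0.9])           # proportional delay factors, 0 < q_j <= 1
eps = 0.1                               # fast time scale of the STM
f = np.tanh                             # nonlinear activation (assumed)

h, T = 1e-3, 20.0
ts = np.arange(1.0, T, h)               # the model is posed for t >= 1
x = np.zeros((len(ts), n))              # STM states x_i(t)
m = np.zeros((len(ts), n, n))           # LTM states m_ij(t)
x[0] = rng.standard_normal(n)           # initial activity levels at t = 1

for k in range(len(ts) - 1):
    t = ts[k]
    # x_j(q_j t): since q_j t <= t, look it up in the trajectory so far;
    # np.interp clamps q_j t < 1 to the initial state (initial-history choice).
    xq = np.array([np.interp(q[j] * t, ts[:k + 1], x[:k + 1, j])
                   for j in range(n)])
    # STM: eps dx_i/dt = -a_i x_i + sum_j b_ij f(x_j(t))
    #                    + sum_j c_ij f(x_j(q_j t)) + B_i sum_j d_j m_ij + I_i
    dx = (-a * x[k] + b @ f(x[k]) + c @ f(xq) + B * (m[k] @ d) + I) / eps
    # LTM: dm_ij/dt = -m_ij + d_j f(x_i(t))
    dm = -m[k] + np.outer(f(x[k]), d)
    x[k + 1] = x[k] + h * dx
    m[k + 1] = m[k] + h * dm

print("x(T) =", x[-1])
```

With decay rates dominating the small assumed weights, the trajectory should settle toward an equilibrium, which is the qualitative behaviour the exponential-stability results describe; whether a given parameter set actually satisfies the paper's sufficient conditions must be checked separately.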
“…It is worth mentioning that MCNNs with different time scales are extensions of conventional neural networks. They are a kind of unsupervised learning neural network, characterized by full interconnection between the input and output of a single-layer network [25]. MCNNs contain two types of state variables, corresponding to short-term memory (STM) and long-term memory (LTM) [10]: the STM captures the fast changes of the neural network states, while the LTM captures the slow changes of the synapses caused by external stimuli.…”
confidence: 99%