2016
DOI: 10.1016/j.cam.2016.03.006

A novel summation inequality for stability analysis of discrete-time neural networks

Cited by 25 publications (14 citation statements)
References 21 publications
“…Therefore, the proposed method largely reduces the computational burden. In addition, we have modified the Lyapunov function by including the leakage terms (i.e., $\varrho\sum_{i=k-\varrho}^{k-1}\mu^{T}(i)Z_{1}\mu(i)$ and $\rho\sum_{j=k-\rho}^{k-1}\nu^{T}(j)Z_{2}\nu(j)$) and proved that it is less conservative than the stability conditions obtained in the work of Liu et al (see Corollary).…”
Section: Asymptotic Stability Results
confidence: 98%
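For context, a minimal sketch of why such leakage-type summation terms are useful in a Lyapunov-Krasovskii functional. The term below and its forward difference are a standard construction, not necessarily the exact functional of the cited work; $\varrho$, $\mu$, and $Z_{1}$ are taken from the quoted statement, and $\varrho$ is assumed constant:

$$V_{\ell}(k) = \varrho \sum_{i=k-\varrho}^{k-1} \mu^{T}(i)\,Z_{1}\,\mu(i), \qquad Z_{1}=Z_{1}^{T}\succ 0,$$

$$\Delta V_{\ell}(k) = V_{\ell}(k+1)-V_{\ell}(k) = \varrho\,\mu^{T}(k)Z_{1}\mu(k) - \varrho\,\mu^{T}(k-\varrho)Z_{1}\mu(k-\varrho),$$

so the forward difference contributes a negative quadratic term in the delayed variable $\mu(k-\varrho)$, which can then be exploited in the resulting LMI stability condition.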
“…As a result of applying the mentioned techniques, some less conservative results are derived in [56]. It is also shown that the derived results are less conservative than the existing ones [24,31,57,58].…”
Section: Introduction
confidence: 98%
“…Since most systems use a digital processor to acquire information from computers at discrete instants of time, it is essential to formulate discrete-time neural networks (DNNs) that are an analogue of continuous ones [11][12][13][14][15][16][17][18]. In order to improve results regarding this problem, various techniques have been applied to the delay-dependent category, such as augmented Lyapunov-Krasovskii (LK) functional [13,[19][20][21][22], free-weighting matrix method [18,23], summation inequality method [16,[24][25][26][27], delay-partitioning method [5,28,29] and reciprocally convex approach [20,30,31].…”
Section: Introduction
confidence: 99%
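As a representative instance of the summation inequality method mentioned above, the following is the standard discrete Jensen-type bound, not necessarily the novel inequality proposed in the cited paper: for any matrix $R=R^{T}\succ 0$, integer $h\ge 1$, and sequence $\eta(i)$,

$$\sum_{i=k-h}^{k-1} \eta^{T}(i)\,R\,\eta(i) \;\ge\; \frac{1}{h}\left(\sum_{i=k-h}^{k-1}\eta(i)\right)^{T} R \left(\sum_{i=k-h}^{k-1}\eta(i)\right).$$

Sharper bounds of this type, when used to estimate the summation terms in the forward difference of a Lyapunov-Krasovskii functional, yield less conservative LMI stability conditions.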
“…The authors in [25] proposed novel summation inequalities with time-varying delays to check the stability of neural networks. Results on the Wirtinger-based inequality have been proved by the authors of [26,27]. Liu in [28,29] investigated exponential stability for discrete-time BAM neural networks with some novel criteria, and in [30] the authors studied discrete-time BAM neural networks in the exponential sense.…”
Section: Introduction
confidence: 99%
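For reference, a commonly used discrete-time BAM neural network model of the kind studied in the works cited above has the form below; the matrices, activation functions, and delays are generic placeholders rather than those of the cited papers:

$$\begin{aligned} x(k+1) &= A\,x(k) + W_{1}\,f\big(y(k-\sigma(k))\big) + I_{1},\\ y(k+1) &= B\,y(k) + W_{2}\,g\big(x(k-\tau(k))\big) + I_{2}, \end{aligned}$$

where $A$ and $B$ are state matrices, $W_{1}$ and $W_{2}$ are connection weight matrices, $f$ and $g$ are activation functions, $\sigma(k)$ and $\tau(k)$ are time-varying delays, and $I_{1}$, $I_{2}$ are external inputs.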