2021
DOI: 10.1186/s13662-021-03586-4
Mean-square exponential input-to-state stability of stochastic inertial neural networks

Abstract: By introducing parameters perturbed by white noise, we propose a class of stochastic inertial neural networks in random environments. By constructing two Lyapunov–Krasovskii functionals, we establish the mean-square exponential input-to-state stability of the addressed model, which generalizes and refines recent results. In addition, an example with a numerical simulation is carried out to support the theoretical findings.
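For orientation, a minimal sketch of the general form such stochastic inertial (second-order) neural network models usually take is given below; the symbols used here (a_i, b_i, c_ij, d_ij, tau_ij, sigma_i, w_i) are assumed for illustration and need not match the paper's own notation.

\[
d\dot{x}_i(t) = \Big[ -a_i \dot{x}_i(t) - b_i x_i(t)
+ \sum_{j=1}^{n} c_{ij} f_j\big(x_j(t)\big)
+ \sum_{j=1}^{n} d_{ij} f_j\big(x_j(t-\tau_{ij}(t))\big)
+ u_i(t) \Big]\,dt
+ \sigma_i\big(t, x_i(t), \dot{x}_i(t)\big)\,dw_i(t),
\quad i = 1, \dots, n,
\]

where \(x_i\) is the state of the \(i\)-th neuron, \(a_i, b_i > 0\) are the damping and feedback parameters that the white-noise perturbations act on, \(f_j\) are activation functions, \(\tau_{ij}(t)\) are bounded time-varying delays, \(u_i(t)\) is the external input whose bound enters the input-to-state stability estimate, and \(w_i(t)\) are independent Brownian motions.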

Cited by 6 publications (3 citation statements)
References 42 publications
“…Obviously, Assumption 1, the local Lipschitz condition, is weaker than the uniform Lipschitz condition assumed in [4–29]. Hence, our results improve and generalize some existing results of [4–29]. Moreover, we shall show that the main result of [19] is a particular case of this paper.…”
Section: Results (supporting)
confidence: 51%
“…To our knowledge, few researchers have considered stochastic delayed recurrent neural networks without a uniform Lipschitz condition. Obviously, Assumption 1, the local Lipschitz condition, is weaker than the uniform Lipschitz condition assumed in [4–29]. Hence, our results improve and generalize some existing results of [4–29].…”
Section: Results (mentioning)
confidence: 52%
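For reference, the distinction drawn in these citation statements is the standard one, stated here under generic notation not taken from the paper itself:

\[
\text{uniform (global) Lipschitz: } \exists\, L>0 \text{ s.t. } |f(u)-f(v)| \le L\,|u-v| \ \ \forall\, u,v \in \mathbb{R};
\qquad
\text{local Lipschitz: } \forall\, R>0\ \exists\, L_R>0 \text{ s.t. } |f(u)-f(v)| \le L_R\,|u-v| \ \ \forall\, |u|,|v| \le R.
\]

Every uniformly Lipschitz function is locally Lipschitz, but not conversely (for example, \(f(u)=u^2\) is only locally Lipschitz), which is why the local condition is the weaker assumption.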