DOI: 10.4028/www.scientific.net/amm.511-512.875

Abstract: The H∞ filter design problem for recurrent neural networks with time delay is considered. Based on a delay-decomposition approach, a delay-dependent condition is derived that ensures the filtering error system is globally asymptotically stable with a guaranteed H∞ performance level. The filter design is then cast as a feasibility problem in terms of linear matrix inequalities (LMIs). A numerical example demonstrates the effectiveness of the proposed approach.
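The stability certificates behind such LMI conditions reduce, in the simplest delay-free case, to a Lyapunov feasibility test. A minimal sketch of that idea, assuming a hypothetical 2×2 error-system matrix `A` chosen here purely for illustration (it is not from the paper, and the actual conditions in the paper involve delay-dependent terms):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable error-system matrix (illustration only, not from the paper).
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
Q = np.eye(2)

# Solve the Lyapunov equation A^T P + P A = -Q for P.
P = solve_continuous_lyapunov(A.T, -Q)

# A is asymptotically stable iff P is symmetric positive definite,
# i.e. all eigenvalues of the symmetrized P are strictly positive.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
stable = bool(np.all(eigs > 0))
print(stable)
```

The delay-dependent conditions in the paper generalize this test: the single matrix inequality is replaced by a larger block LMI whose feasibility is checked numerically with an LMI solver.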