“…Due to their wide applications in areas such as pattern classification, associative memory, parallel computation, optimization, moving-object speed detection, and so on, recurrent neural networks (RNNs) have been extensively studied in recent years (see, e.g., Hopfield 1984; Hopfield and Tank 1986; Grujić and Michel 1991; Matsuoka 1992; Arik 2000; Ensari and Arik 2005; Zhang et al. 2008; Wu et al. 2008, 2010; Huang et al. 2012; Huang and Feng 2009; Ahn 2010a; Liu and Cao 2010; Ahn 2010b, c, 2011a, b, 2012a; Sanchez and Perez 1999; Zhu and Shen 2012, and references therein). Since time delay is unavoidably encountered in the implementation of RNNs and is frequently a source of oscillation and instability, the stability of delayed neural networks has become a topic of great theoretical and practical importance, and many interesting stability results in the Lyapunov sense have been derived (see also, e.g.,…”