“…In studies on the global stability of neural network models, both discrete and continuous, it is usually assumed that the activation functions f_j are Lipschitz [2,6,7,10,12,25,37]. Here we do not assume that the f_j are Lipschitz; hypothesis (H1) only implies the continuity of f_j at u = 0.…”
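To illustrate the gap between the two conditions, here is a standard example (my own, not from the source) of an activation function that is continuous at the origin yet fails to be Lipschitz there:

```latex
% Example: continuous at u = 0 but not Lipschitz at u = 0.
f_j(u) = \sqrt{|u|}.
% Continuity at 0: |f_j(u) - f_j(0)| = \sqrt{|u|} \to 0 as u \to 0.
% Not Lipschitz at 0: for u > 0,
%   \frac{|f_j(u) - f_j(0)|}{|u - 0|} = \frac{\sqrt{u}}{u} = \frac{1}{\sqrt{u}} \to \infty
% as u \to 0^{+}, so no Lipschitz constant L can satisfy
% |f_j(u) - f_j(0)| \le L |u| near the origin.
```

Functions of this kind satisfy the continuity-at-zero requirement suggested by (H1) while lying outside the Lipschitz class assumed in [2,6,7,10,12,25,37], which is the distinction the passage draws.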