The design of new, faster neural models is an active and significant field of research in engineering and numerical linear algebra. Various nonlinear functions, called activation functions (AFs), have been used to accelerate the convergence of recurrent neural network (RNN) formulas. In this manuscript, a new dynamical system based on a novel odd, increasing nonlinear extended sign-bipower (Nesbp) AF is applied to solve the time-varying generalized Sylvester equation (TVGSE), a general equation that arises in several research domains. Theoretical convergence analysis and numerical experiments in Simulink demonstrate the efficiency of the proposed formula and highlight the faster convergence of the new AF compared with previous AFs.
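To illustrate the general approach, the following is a minimal sketch of a zeroing-RNN-style solver for a time-varying Sylvester equation A(t)X + X B(t) = C(t). It is not the manuscript's Nesbp formula: the activation shown is the classic sign-bipower function, the coefficient matrices A, B, C and the gain `gamma` are hypothetical, and simple Euler integration with finite-difference time derivatives stands in for a Simulink model.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def sbp(e, r=0.5):
    # Classic sign-bi-power activation (illustrative only;
    # the paper's Nesbp AF is an extended variant of this idea).
    return np.sign(e) * (np.abs(e) ** r + np.abs(e) ** (1.0 / r))

# Hypothetical time-varying coefficients A(t), B(t), C(t).
def A(t): return np.array([[3 + np.sin(t), 0.2], [0.1, 3 + np.cos(t)]])
def B(t): return np.array([[2 + np.cos(t), 0.0], [0.3, 2 + np.sin(t)]])
def C(t): return np.array([[np.sin(t), 1.0], [0.5, np.cos(t)]])

gamma, dt, T = 10.0, 1e-3, 2.0   # design gain, Euler step, horizon
h = 1e-6                         # finite-difference step for d/dt
X = np.zeros((2, 2))             # initial state

for k in range(int(T / dt)):
    t = k * dt
    # Error function E(t) = A(t)X + X B(t) - C(t); the RNN drives E -> 0
    # via the activated dynamics dE/dt = -gamma * phi(E).
    E = A(t) @ X + X @ B(t) - C(t)
    Adot = (A(t + h) - A(t - h)) / (2 * h)
    Bdot = (B(t + h) - B(t - h)) / (2 * h)
    Cdot = (C(t + h) - C(t - h)) / (2 * h)
    # Solving A Xdot + Xdot B = rhs gives the state derivative.
    rhs = Cdot - Adot @ X - X @ Bdot - gamma * sbp(E)
    Xdot = solve_sylvester(A(t), B(t), rhs)
    X = X + dt * Xdot

# Residual of the Sylvester equation at the final time.
res = np.linalg.norm(A(T) @ X + X @ B(T) - C(T))
```

The design choice mirrored here is the standard one for zeroing RNNs: the AF acts elementwise on the error matrix, and a steeper AF near the origin (the bipower term with exponent below one) is what yields the accelerated, finite-time style convergence that the manuscript's comparisons target.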