An extended Wang neural network (EWNN) is proposed to solve a set of linear equations online. The EWNN takes a general nonlinear model form with redundant parts, so that it can cope with the nonlinear phenomena that arise in circuit implementations of the Wang neural network (WNN). Furthermore, two types of nonlinear activation function are proposed for the EWNN to improve the convergence of the WNN. Illustrative results verify the efficacy of the proposed EWNN for the online solution of linear equations.

Introduction: Solving linear equations is a fundamental problem in many engineering applications. Implemented with large-scale analogue or digital circuits, a neural network can solve linear equations online in a parallel-processing manner, remedying the inefficiency of conventional numerical approaches. Wang proposed a recurrent neural network that elegantly solves linear equations online, together with a preliminary circuit implementation [1]. Raida [2] enhanced the performance of this Wang neural network (WNN) by using an improved circuit schematic. Zhang and Chen [3] analysed the global exponential convergence and stability of the WNN. Chen [4] proved the robustness of the WNN for solving linear equations under implementation noises and derived an upper bound on the steady-state solution error.
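The WNN dynamics are not spelled out in the passage above; a common formulation for solving Ax = b is the gradient-descent flow x' = -γAᵀ(Ax - b), whose equilibrium is the exact solution. The following is a minimal numerical sketch under that assumption (the name `wnn_solve` and the parameter values `gamma`, `dt`, `steps` are illustrative, not from the Letter):

```python
import numpy as np

def wnn_solve(A, b, gamma=10.0, dt=1e-3, steps=5000):
    """Euler-integrate the linear WNN flow x' = -gamma * A^T (A x - b).

    This gradient-descent dynamics drives the residual A x - b to zero,
    so x converges to the solution of A x = b (A assumed nonsingular).
    """
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x += dt * (-gamma) * (A.T @ (A @ x - b))
    return x

# Example: solve 2*x1 + x2 = 3 and x1 + 3*x2 = 5
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(wnn_solve(A, b))  # approaches the exact solution [0.8, 1.4]
```

An analogue circuit realises the same flow in continuous time; the Euler loop here merely simulates it, with `dt * gamma` small enough for numerical stability.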
Motivation: The previous WNN was designed based on the gradient-descent approach and is depicted in a linear form. In real circuit implementations, however, nonlinear phenomena can arise in the amplifiers (which correspond to the activation-function parts of the neural network), so the WNN must be able to work under such nonlinear conditions. Moreover, the neural circuit may break down completely if no redundant parts are included. To overcome these unexpected circumstances, an extended WNN (EWNN) is proposed in this Letter to solve a set of linear equations. Equipped with the nonlinear activation functions proposed in the sequel, the EWNN achieves superior convergence compared with the previous WNN.
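The Letter's specific activation functions are introduced later and are not reproduced here. As an illustration only, the linear residual term of the WNN flow can be passed through an element-wise odd, monotonically increasing activation, giving x' = -γAᵀΦ(Ax - b); the scaled-tanh choice below is an assumption for demonstration, not the Letter's design:

```python
import numpy as np

def ewnn_solve(A, b, act, gamma=10.0, dt=1e-3, steps=5000):
    """Euler-integrate the activated flow x' = -gamma * A^T act(A x - b).

    `act` is applied element-wise to the residual; any odd, monotonically
    increasing function preserves the equilibrium at the solution of A x = b.
    """
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x += dt * (-gamma) * (A.T @ act(A @ x - b))
    return x

# Illustrative activation (an assumption, not the Letter's design): a scaled
# tanh, which boosts the gain for small residuals while saturating large
# ones, much as a real amplifier stage would.
act = lambda e: np.tanh(5.0 * e)

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(ewnn_solve(A, b, act))  # still converges to the solution [0.8, 1.4]
```

Note that `act = lambda e: e` recovers the original linear WNN, so the extended model contains the previous one as a special case.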