Presents a novel synthesis procedure to realize an associative memory using the Generalized-Brain-State-in-a-Box (GBSB) neural model. The implementation yields an interconnection structure that guarantees that the desired memory patterns are stored as asymptotically stable equilibrium points and that possesses very few spurious states. Furthermore, the interconnection structure is in general non-symmetric. Simulation examples are given to illustrate the effectiveness of the proposed synthesis method. The results obtained for the GBSB model are successfully applied to other neural network models.
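The GBSB update rule has the form x(k+1) = g((I + αW)x(k) + αb), where g saturates each component to the hypercube [-1, 1]^n. A minimal sketch of this recall dynamics follows; the outer-product weight matrix and zero bias below are purely illustrative stand-ins, not the paper's synthesis procedure.

```python
import numpy as np

def gbsb_step(x, W, b, alpha):
    # One GBSB iteration: x(k+1) = g((I + alpha*W) x(k) + alpha*b),
    # with g clipping each component to [-1, 1].
    return np.clip(x + alpha * (W @ x + b), -1.0, 1.0)

# Illustrative weights (not the paper's synthesis): a simple outer-product
# matrix that makes the bipolar pattern v an asymptotically stable vertex.
v = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(v, v)        # hypothetical interconnection matrix
b = np.zeros_like(v)      # zero bias for this toy example

x = np.array([0.6, -0.4, -0.2, -0.8])   # noisy version of v
for _ in range(50):
    x = gbsb_step(x, W, b, alpha=0.1)

print(x)  # recovers the stored pattern v = [1, -1, 1, -1]
```

Starting from a corrupted input inside the hypercube, the iterates are pushed toward the stored vertex and saturate there, which is the associative-recall behavior the synthesis procedure is designed to guarantee.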
Deals with the use of neural networks to solve linear and nonlinear programming problems. The dynamics of these networks, in particular those of the canonical nonlinear programming circuit, are analyzed. The circuit is shown to be a gradient system that seeks to minimize an unconstrained energy function that can be viewed as a penalty-method approximation of the original problem. Next, the implementations that correspond to the dynamical canonical nonlinear programming circuit are examined. It is shown that the energy function that the system seeks to minimize differs from that of the canonical circuit, due to the saturation limits of the op-amps in the circuit. It is also noted that this difference can cause the circuit to converge to a different state than the dynamical canonical circuit. To remedy this problem, a new circuit implementation is proposed.
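The penalty-method energy that such a gradient system minimizes can be sketched as E(x) = f(x) + (s/2)·Σ max(0, gⱼ(x))², descended by ordinary gradient steps. The toy problem and penalty weight below are assumptions for illustration, not the circuit from the paper.

```python
def penalty_descent(grad_f, constraints, x0, s=100.0, lr=5e-3, steps=2000):
    # Gradient descent on the penalty energy
    #   E(x) = f(x) + (s/2) * sum_j max(0, g_j(x))^2,
    # where each constraint is a pair (g, grad_g) with g(x) <= 0 feasible.
    x = x0
    for _ in range(steps):
        grad = grad_f(x)
        for g, dg in constraints:
            grad += s * max(0.0, g(x)) * dg(x)   # penalty active only if g > 0
        x -= lr * grad
    return x

# Toy problem: minimize (x - 3)^2 subject to x <= 1.
x_star = penalty_descent(
    grad_f=lambda x: 2.0 * (x - 3.0),
    constraints=[(lambda x: x - 1.0, lambda x: 1.0)],
    x0=0.0,
)
print(x_star)  # ~1.039 = 106/102: near the true minimizer x = 1, offset
               # because a finite penalty weight only approximates the problem
```

The small constraint violation at the limit point illustrates the abstract's point: the energy is a penalty-method approximation, so the equilibrium need not satisfy the original constraints exactly.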
This paper is concerned with utilizing neural networks and analog circuits to solve constrained optimization problems. A novel neural network architecture is proposed for solving a class of nonlinear programming problems. The proposed neural network, or more precisely a physically realizable approximation of it, is then used to solve minimum norm problems subject to linear constraints. Minimum norm problems have many applications in various areas, but we focus on their applications to the control of discrete dynamic processes. The applicability of the proposed neural network is demonstrated on numerical examples.
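A sketch of the control application: among all input sequences driving a discrete-time linear process x(k+1) = Ax(k) + Bu(k) from x0 to a target xf in N steps, the minimum-norm one solves a linearly constrained minimum norm problem with closed-form answer via the pseudoinverse. The system matrices below are hypothetical, chosen only to make the example concrete.

```python
import numpy as np

# Hypothetical discrete-time process (a double integrator):
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0], [1.0]])
x0 = np.zeros(2)
xf = np.array([1.0, 0.0])
N = 3

# Stack the reachability map: x(N) = A^N x0 + C u, with
# C = [A^(N-1) B, ..., A B, B] and u = (u(0), ..., u(N-1)).
C = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])
d = xf - np.linalg.matrix_power(A, N) @ x0

# Minimum-norm solution of C u = d; equals C^T (C C^T)^{-1} d
# when C has full row rank.
u = np.linalg.pinv(C) @ d

# Verify by simulating the process under the computed inputs.
x = x0.copy()
for k in range(N):
    x = A @ x + B[:, 0] * u[k]
print(x)  # reaches xf = [1, 0]
```

The pseudoinverse gives the exact minimum-norm control; the paper's neural network can be read as an analog route to the same solution without forming the inverse explicitly.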