Abstract-We describe a neuromorphic chip with a two-layer excitatory-inhibitory recurrent network of spiking neurons that exhibits localized clusters of neural activity. Unlike other recurrent networks, the clusters in our network are pinned to certain locations due to transistor mismatch introduced in fabrication. As described in previous work, our pinned clusters respond selectively to oriented stimuli, and the neurons' preferred orientations are distributed similarly to those in the visual cortex. Here we show that orientation computation is rapid when activity alternates between layers (staccato-like), dislodging pinned clusters and thereby promoting fast cluster diffusion.
I. PATTERN-FORMING RECURRENT NETWORKS

A 2-D recurrent network of spiking neurons with Mexican hat connectivity (local excitation and distal inhibition) can exhibit clusters of activity when the feedback is sufficiently strong. These clusters are an emergent property of the network and are identified by contiguous regions of activity surrounded by dead zones (no activity). In a homogeneous network (i.e., neurons and their connections are all identical), the locations of the clusters are unconstrained and have an equal likelihood of existing at any position. Therefore, clusters in a homogeneous network move in a random walk, constrained only by their interactions with nearby clusters [1].

In contrast, networks with heterogeneous neurons tend to bias the locations where clusters reside. Clusters do not wander freely but are instead pinned to the locations that maximize their local recurrent feedback. One intriguing possibility is that the interactions between spatio-temporal input patterns (e.g., visual scenes) and these pinned clusters can process information. For example, it has been shown that oriented stimuli are able to shift the clusters away from their preferred locations to produce orientation selective responses whose distribution resembles cortical maps of preferred orientation (PO) [2], [3]. The seemingly complicated task of assigning similar POs to nearby neurons is cleverly achieved by simply building an imprecise, recurrent network.

Transforming fixed-pattern noise into a smoothly changing feature map is an impressive feat, but this transformation is poorly understood. In particular, it is not known how cluster dynamics, which can range from fluid (mobile clusters) to crystalline (immobile clusters), influence PO map creation.
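The cluster-forming mechanism described above can be illustrated with a minimal rate-based sketch (not the chip's spiking circuit): a 2-D grid of units coupled by a difference-of-Gaussians ("Mexican hat") kernel, iterated until patches of activity separated by dead zones emerge. All parameter values below (grid size, kernel widths, drive) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mexican_hat_kernel(n, sigma_e=2.0, sigma_i=5.0, a_e=1.0, a_i=0.173):
    """Difference-of-Gaussians kernel on an n x n torus:
    short-range excitation minus longer-range inhibition.
    a_i is chosen so the net kernel sum is slightly negative."""
    idx = np.arange(n)
    d = np.minimum(idx, n - idx)               # periodic 1-D distance
    d2 = d[:, None] ** 2 + d[None, :] ** 2     # squared 2-D distance
    return (a_e * np.exp(-d2 / (2 * sigma_e**2))
            - a_i * np.exp(-d2 / (2 * sigma_i**2)))

def simulate_clusters(n=40, steps=300, dt=0.2, drive=0.3, seed=0):
    """Relax a clipped-linear rate network; pattern-forming modes of the
    Mexican hat kernel amplify initial noise into discrete clusters."""
    rng = np.random.default_rng(seed)
    k_hat = np.fft.fft2(mexican_hat_kernel(n))   # periodic convolution via FFT
    r = 0.1 * rng.random((n, n))                 # small random initial rates
    for _ in range(steps):
        recurrent = np.real(np.fft.ifft2(k_hat * np.fft.fft2(r)))
        target = np.clip(drive + recurrent, 0.0, 1.0)  # rates bounded in [0, 1]
        r = r + dt * (target - r)                # leaky relaxation toward target
    return r

r = simulate_clusters()
active = (r > 0.8).mean()   # fraction of units inside clusters
dead = (r < 0.01).mean()    # fraction in the surrounding dead zones
```

In a homogeneous grid like this one, the cluster positions depend only on the random seed; adding a small frozen per-unit bias (analogous to transistor mismatch) pins them to fixed locations across runs.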
We address this issue by characterizing how clusters diffuse over a range of network states and by examining the speed at which orientation maps converge for two disparate diffusion rates.

Exploring a detailed, large-scale recurrent network over a wide range of network parameters is a computationally daunting task that is poorly suited for software modeling. We refer to circuit parameters as type_par, where type specifies the section of the neural circuit, and par specifies the circuit parameter (e.g., the excitatory cell synapse parameter A is referenced as E_A). Previous attempts to model such networks in software have sacrifi...