We study the computational capacity of a model neuron, the Tempotron, which classifies sequences of spikes by linear-threshold operations. We use statistical mechanics and extreme value theory to derive the capacity of the system in random classification tasks. In contrast to its static analog, the Perceptron, the Tempotron's solution space consists of a large number of small clusters of weight vectors. The capacity of the system per synapse is finite in the large size limit and weakly diverges with the stimulus duration relative to the membrane and synaptic time constants.

PACS numbers: 87.18.Sn, 87.19.ll, 87.19.lv

Neural network models of supervised learning are usually concerned with processing static spatial patterns of intensities. A famous example is learning in a single-layer binary neuron, the Perceptron [1, 2]. However, in most neuronal systems, neural activities take the form of time series of spikes. Furthermore, stimulus representations in some sensory systems are characterized by a small number of precisely timed spikes [3, 4], suggesting that the brain possesses machinery for extracting information embedded in the timings of spikes, not only in their overall rate. Thus, understanding the power and limitations of spike-timing-based computation and learning is of fundamental importance in computational neuroscience.

Gütig and Sompolinsky [5] have recently suggested a simple model, the Tempotron, for decoding information embedded in spatio-temporal spike patterns. The Tempotron is an Integrate and Fire (IF) neuron with N input synapses of strength ω_i, i = 1, ..., N. Each input pattern is represented by N sequences of spikes, where the spike timings for afferent i are denoted by {t_i}. The membrane potential is given by

U(t) = Σ_i ω_i Σ_{t_i} u(t − t_i),

where u(t) denotes a fixed causal temporal kernel. An example is the difference-of-exponentials form, u(t) = u_0 [exp(−t/τ_m) − exp(−t/τ_s)] for t ≥ 0 (and zero otherwise), where τ_m and τ_s correspond, respectively, to the membrane and synaptic time constants [6].
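The membrane potential above can be sketched numerically. The following minimal Python illustration sums the causal difference-of-exponentials kernel over all afferents and spike times; the specific values of τ_m and τ_s, and the normalization of the kernel peak to 1, are illustrative assumptions, not taken from the text:

```python
import numpy as np

def psp_kernel(t, tau_m=0.015, tau_s=0.005):
    """Causal difference-of-exponentials kernel u(t), zero for t < 0.
    u_0 is chosen here so that the kernel peaks at 1 (illustrative choice)."""
    t = np.asarray(t, dtype=float)
    t_pos = np.clip(t, 0.0, None)  # avoid overflow when evaluating at t < 0
    # peak time of exp(-t/tau_m) - exp(-t/tau_s), from setting its derivative to zero
    t_peak = tau_m * tau_s / (tau_m - tau_s) * np.log(tau_m / tau_s)
    u0 = 1.0 / (np.exp(-t_peak / tau_m) - np.exp(-t_peak / tau_s))
    u = u0 * (np.exp(-t_pos / tau_m) - np.exp(-t_pos / tau_s))
    return np.where(t >= 0, u, 0.0)

def membrane_potential(t, weights, spike_times):
    """U(t) = sum_i w_i sum_{t_i} u(t - t_i) for one input pattern.
    spike_times is a list with one array of spike times per afferent."""
    return sum(w * psp_kernel(t - np.asarray(times)).sum()
               for w, times in zip(weights, spike_times))
```

Evaluating `membrane_potential` on a grid of times and comparing the maximum against the threshold reproduces the classification rule described next.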
The Tempotron fires a spike whenever U crosses the threshold, U_th, from below [7] (Fig. 1a). The Tempotron performs a binary classification of its input patterns by firing one or more output spikes when presented with a 'target' (+1) pattern and remaining quiescent during a 'null' (−1) pattern.

In this Letter we present a theoretical study of the computational power of the Tempotron. We focus on the standard task of classifying a batch of P = αN random patterns, where α denotes the number of patterns per input synapse. For each pattern, the timings of the input spikes from each input neuron are randomly chosen from independent Poisson processes with rate 1/T, where T is the duration of the input patterns, and the desired output, y = ±1, is randomly and independently chosen with equal probabilities. A solution to the classification problem is a set of synaptic weights {ω_i} that yields a correct classification of all P patterns. We will address several fundamental questions. First, numerical simulations based on a simple error-correcting on-line learning algorithm suggest that...
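The random classification task can be generated in a few lines. The sketch below draws, for each pattern, Poisson spike trains at rate 1/T on [0, T] (so one spike per afferent on average) together with equiprobable ±1 labels, and classifies a pattern by scanning U(t) on a time grid for a threshold crossing; the time constants, threshold value, and grid step are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_task(N, alpha, T=0.5):
    """P = alpha*N random patterns: Poisson spikes at rate 1/T per afferent
    on [0, T], and independent labels y = +/-1 with equal probabilities."""
    P = int(alpha * N)
    patterns = [[np.sort(rng.uniform(0.0, T, rng.poisson(1.0)))
                 for _ in range(N)] for _ in range(P)]
    labels = rng.choice([-1, 1], size=P)
    return patterns, labels

def u(t, tau_m=0.015, tau_s=0.005):
    """Causal difference-of-exponentials kernel (unnormalized here)."""
    tp = np.clip(t, 0.0, None)
    return np.where(t >= 0, np.exp(-tp / tau_m) - np.exp(-tp / tau_s), 0.0)

def classify(weights, spikes, u_th=1.0, T=0.5, dt=1e-3):
    """+1 if U(t) reaches the threshold anywhere on [0, T], else -1."""
    grid = np.arange(0.0, T, dt)
    U = np.zeros_like(grid)
    for w_i, t_i in zip(weights, spikes):
        for s in t_i:
            U += w_i * u(grid - s)
    return 1 if U.max() >= u_th else -1
```

Note this is a brute-force scan of U(t) over a grid rather than an event-based threshold-crossing detector; a solution {ω_i} is one for which `classify` matches the label of all P patterns.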