The nervous system represents time-dependent signals in sequences of discrete, identical action potentials or spikes; information is carried only in the spike arrival times. We show how to quantify this information, in bits, free from any assumptions about which features of the spike train or input signal are most important, and we apply this approach to the analysis of experiments on a motion-sensitive neuron in the fly visual system. This neuron transmits information about the visual stimulus at rates of up to 90 bits/s, within a factor of 2 of the physical limit set by the entropy of the spike train itself.
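The assumption-free quantification described above can be illustrated with a "direct" entropy estimate: discretize the spike train into binary words, then subtract the trial-to-trial (noise) entropy from the total word entropy. The sketch below uses synthetic spike data; the bin width, word length, trial count, and firing statistics are all illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def word_entropy(words):
    """Entropy, in bits, of the empirical distribution of spike 'words'."""
    _, counts = np.unique(words, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
# Hypothetical data: 1000 repeated trials, 80 bins of 2 ms each,
# binarized (1 = at least one spike in the bin).
spikes = (rng.random((1000, 80)) < 0.1).astype(int)

# Words are L = 8 consecutive bins, packed into integer codes for counting.
L, dt = 8, 0.002
codes = np.array([spikes[:, i:i + L] @ (2 ** np.arange(L))
                  for i in range(80 - L + 1)])   # shape (windows, trials)

# Total entropy: word distribution pooled over all times and trials.
S_total = word_entropy(codes.ravel())
# Noise entropy: word entropy across trials at each fixed time, averaged.
S_noise = np.mean([word_entropy(c) for c in codes])
# Information rate in bits per second is the difference per unit time.
info_rate = (S_total - S_noise) / (L * dt)
```

With real data one would also extrapolate in word length and data-set size to control sampling bias; this sketch shows only the core entropy-difference computation.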
To provide information about dynamic sensory stimuli, the pattern of action potentials in spiking neurons must be variable. To ensure reliability, these variations must be related, reproducibly, to the stimulus. For H1, a motion-sensitive neuron in the fly's visual system, constant-velocity motion produces irregular spike firing patterns, and spike counts typically have a variance comparable to the mean, as observed for cells in the mammalian cortex. But more natural, time-dependent input signals yield spike patterns that are much more reproducible, in both timing and counting precision. Variability and reproducibility can be quantified with ideas from information theory, and measured spike sequences in H1 carry more than twice the information they would if they followed the variance-mean relation seen with constant inputs. Thus, models that accurately account for the neural response to static stimuli can significantly underestimate the reliability of signal transfer under more natural conditions.
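The variance-to-mean ratio of spike counts (the Fano factor) is the statistic behind the comparison above: it is near 1 for Poisson-like firing and well below 1 for reproducible responses. A minimal sketch with synthetic counts; the Poisson and narrow-Gaussian trial distributions are stand-ins, not models of the actual H1 data.

```python
import numpy as np

def fano(counts):
    """Variance-to-mean ratio of spike counts; equals 1 for a Poisson process."""
    return counts.var() / counts.mean()

rng = np.random.default_rng(1)
# Hypothetical spike counts over 500 repeated trials.
# Constant-stimulus-like firing: Poisson, variance ~ mean.
poisson_counts = rng.poisson(20, size=500)
# Dynamic-stimulus-like firing: much narrower count distribution.
reliable_counts = np.round(rng.normal(20, 1.5, size=500)).astype(int)

f_const = fano(poisson_counts)    # close to 1
f_dynamic = fano(reliable_counts) # well below 1
```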
A quantitative analysis of a recent model of high-temperature superconductors based on an interlayer tunneling mechanism is presented. This model can account well for the observed magnitudes of the high transition temperatures in these materials and implies a gap that does not change sign, can be substantially anisotropic, and has the same symmetry as the crystal. The experimental consequences explored so far are consistent with the observations.
We show that the information carried by compound events in neural spike trains (patterns of spikes across time or across a population of cells) can be measured, independent of assumptions about what these patterns might represent. By comparing the information carried by a compound pattern with the information carried independently by its parts, we directly measure the synergy among these parts. We illustrate the use of these methods by applying them to experiments on the motion-sensitive neuron H1 of the fly's visual system, where we confirm that two spikes close together in time carry far more than twice the information carried by a single spike. We analyze the sources of this synergy and provide evidence that pairs of spikes close together in time may be especially important patterns in the code of H1.
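The synergy measure compared above is the information carried by the compound pattern minus the sum of the informations carried by its parts. A minimal worked example, using a hypothetical XOR-like code (not the fly data) in which each binary event alone says nothing about a binary stimulus but the pair determines it exactly:

```python
import numpy as np

def mutual_info(joint):
    """Mutual information in bits from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz]))

# Rows index the stimulus s in {0, 1}; columns index the event pair
# (e1, e2) as 00, 01, 10, 11. XOR-like: s = e1 XOR e2.
p_pair = np.array([[0.25, 0.00, 0.00, 0.25],
                   [0.00, 0.25, 0.25, 0.00]])
I_pair = mutual_info(p_pair)  # 1 bit: the pair fully determines s

# Marginal tables p(s, e1) and p(s, e2): each event alone is uninformative.
p_e1 = np.stack([p_pair[:, [0, 1]].sum(1), p_pair[:, [2, 3]].sum(1)], axis=1)
p_e2 = np.stack([p_pair[:, [0, 2]].sum(1), p_pair[:, [1, 3]].sum(1)], axis=1)
synergy = I_pair - mutual_info(p_e1) - mutual_info(p_e2)  # 1 bit here
```

Positive synergy means the compound pattern carries more than the sum of its parts; for real spike trains the joint tables would be estimated from repeated stimulus presentations.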
Imagine being shown N samples of random variables drawn independently from the same distribution. What can you say about the distribution? In general, of course, the answer is nothing, unless you have some prior notions about what to expect. From a Bayesian point of view, one needs an a priori distribution on the space of possible probability distributions, and such a prior defines a scalar field theory. In one dimension, free field theory with a normalization constraint provides a tractable formulation of the problem, and we discuss generalizations to higher dimensions.
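The simplest concrete instance of "a prior on the space of distributions" is the discrete case: a Dirichlet prior over the probability simplex, whose posterior mean smooths the raw counts. The field-theory prior in the abstract plays the analogous role in the continuum. This sketch is an illustrative toy, not the paper's construction; K, N, and the concentration parameter are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical setup: an unknown distribution over K = 5 outcomes,
# observed through N = 30 independent samples.
K, N = 5, 30
true_p = np.array([0.40, 0.30, 0.15, 0.10, 0.05])
samples = rng.choice(K, size=N, p=true_p)
counts = np.bincount(samples, minlength=K)

# Symmetric Dirichlet prior with concentration alpha (an assumption).
# Its posterior mean adds alpha "pseudocounts" to each outcome, so no
# outcome is estimated to have zero probability from finite data.
alpha = 1.0
posterior_mean = (counts + alpha) / (N + K * alpha)
```

With no prior (alpha → 0) the estimate collapses to the empirical histogram, which assigns zero probability to unseen outcomes; that failure is what motivates putting a distribution on distributions in the first place.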