Phase transitions and critical behavior are crucial issues in both theoretical and experimental neuroscience. We report analytic and computational results on phase transitions and self-organized criticality (SOC) in networks of general stochastic neurons. A stochastic neuron has a firing probability given by a smooth monotonic function Φ(V) of the membrane potential V, rather than a sharp firing threshold. We find that such networks can operate in several dynamic regimes (phases) depending on the average synaptic weight and the shape of the firing function Φ. In particular, we encounter both continuous and discontinuous phase transitions to absorbing states. At the critical boundary of the continuous transition, neuronal avalanches occur whose size and duration distributions follow power laws, as observed in biological neural networks. We also propose and test a new mechanism to produce SOC: the use of dynamic neuronal gains – a form of short-term plasticity probably located at the axon initial segment (AIS) – instead of depressing synapses at the dendrites (as previously studied in the literature). The new self-organization mechanism produces a slightly supercritical state, which we call self-organized supercriticality (SOSC), in accordance with some intuitions of Alan Turing.
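The avalanche dynamics described above can be probed with a toy simulation. The sketch below is our own illustration, not the authors' code: the all-to-all coupling, the piecewise-linear choice Φ(V) = min(gain·V, 1), and all function names are assumptions made for the example. It seeds a single spike and counts the total number of spikes before activity dies out; sweeping the mean synaptic weight w moves the network across the phase transition, and near criticality the avalanche sizes become broadly distributed.

```python
import random

def phi(v, gain=1.0):
    """Illustrative smooth-threshold firing function: Phi(V) = min(gain*V, 1), V >= 0."""
    return max(0.0, min(gain * v, 1.0))

def avalanche_size(n=1000, w=1.0, gain=1.0, rng=None, max_steps=10_000):
    """Run one avalanche in an all-to-all network of n stochastic neurons.

    A single seed spike is forced at time 0. At each discrete step every
    neuron fires independently with probability phi(V_i); firing neurons
    reset to V = 0, and each spike adds w/n to every other potential.
    Returns the total number of spikes before activity dies out.
    """
    rng = rng or random.Random()
    v = [0.0] * n
    v[0] = 1.0  # seed: phi(1) = 1 when gain >= 1, so neuron 0 fires for sure
    total = 0
    for _ in range(max_steps):
        spikes = [i for i in range(n) if rng.random() < phi(v[i], gain)]
        if not spikes:
            break
        total += len(spikes)
        fired = set(spikes)
        kick = w * len(spikes) / n
        v = [0.0 if i in fired else v[i] + kick for i in range(n)]
    return total
```

Collecting `avalanche_size` over many runs and histogramming the results gives the empirical size distribution whose power-law form is discussed in the abstract.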
We describe the statistics of repetition times of a string of symbols in a stochastic process. Denote by τA the time elapsed until the process spells the finite string A, and by SA the number of consecutive repetitions of A. We prove that, as the length of the string grows unboundedly, (1) the distribution of τA, when the process starts with A, is well approximated by a certain mixture of the point measure at the origin and an exponential law, and (2) SA is approximately geometrically distributed. We provide sharp error terms for each of these approximations. The errors we obtain are point-wise and also yield approximations for all the moments of τA and SA. To obtain (1) we assume that the process is φ-mixing, while to obtain (2) we assume the convergence of certain conditional probabilities.
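The geometric law for SA is easy to observe numerically in the simplest mixing case. The sketch below (our own illustration, not from the paper; all names are ours) uses an i.i.d. fair-coin process, where after the string A appears, the next |A| symbols repeat it with probability exactly P(A); for A = "01" this gives P(A) = 1/4 and E[SA] = (1/4)/(3/4) = 1/3.

```python
import random

def count_repetitions(seq, pattern):
    """S_A of the abstract: number of consecutive extra copies of `pattern`
    immediately after its first occurrence in `seq` (None if absent)."""
    s = "".join(seq)
    i = s.find(pattern)
    if i < 0:
        return None
    k, j = 0, i + len(pattern)
    while s[j:j + len(pattern)] == pattern:
        k += 1
        j += len(pattern)
    return k

def empirical_mean_repetitions(pattern="01", n_trials=20_000, seq_len=200, seed=0):
    """Monte Carlo estimate of E[S_A] for a fair i.i.d. {0,1} process."""
    rng = random.Random(seed)
    vals = []
    for _ in range(n_trials):
        seq = [rng.choice("01") for _ in range(seq_len)]
        k = count_repetitions(seq, pattern)
        if k is not None:
            vals.append(k)
    return sum(vals) / len(vals)
```

The empirical mean should settle near 1/3, matching the geometric distribution with success probability P(A) = 1/4.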
We prove that for any α-mixing stationary process the hitting time of any n-string An converges, when suitably normalized, to an exponential law. We identify the normalization constant λ(An). A similar statement also holds for the return time. To establish this result we prove two other results of independent interest. First, we show a relation between the rescaled hitting time and the rescaled return time, generalizing a theorem by Haydn, Lacroix and Vaienti. Second, we show that for positive-entropy systems, the probability of observing any n-string in n consecutive observations goes to zero as n goes to infinity.
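A quick Monte Carlo sanity check of the hitting-time picture (again our own sketch, not the paper's construction): for an i.i.d. fair coin, the string A = "011" has no proper self-overlap, so its expected hitting time is 2³ = 8, and the normalized hitting time τ(A)/E[τ(A)] is approximately exponentially distributed.

```python
import random

def hitting_time(pattern, rng):
    """Number of symbols drawn from a fair i.i.d. {0,1} source until
    `pattern` first appears as a block of consecutive symbols."""
    window, t = "", 0
    while True:
        window = (window + rng.choice("01"))[-len(pattern):]
        t += 1
        if window == pattern:
            return t

def mean_hitting_time(pattern="011", trials=20_000, seed=0):
    """Monte Carlo estimate of E[tau_A] for a fair i.i.d. {0,1} process."""
    rng = random.Random(seed)
    return sum(hitting_time(pattern, rng) for _ in range(trials)) / trials
```

Histogramming `hitting_time("011", rng) / 8.0` over many trials and comparing with the density e^(−t) illustrates the exponential limit law, up to the finite-n corrections that the theorem's normalization λ(An) controls.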
We study the distribution of the occurrence of rare patterns in sufficiently mixing Gibbs random fields on the lattice Z^d, d ≥ 2. A typical example is the high temperature Ising model. This distribution is shown to converge to an exponential law as the size of the pattern diverges. Our analysis not only provides this convergence but also establishes a precise estimate of the distance between the exponential law and the distribution of the occurrence of finite patterns. A similar result holds for the repetition of a rare pattern. We apply these results to the fluctuation properties of occurrence and repetition of patterns: We prove a central limit theorem and a large deviation principle.