Accurate population models are needed to build very large-scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of adaptive exponential integrate-and-fire excitatory and inhibitory neurons. Using a master equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model is capable of correctly predicting the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high- and low-activity states alternate (up-down state dynamics), leading to slow oscillations. We conclude that such mean-field models are biologically realistic in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large-scale models involving multiple brain areas.
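The adaptive exponential integrate-and-fire (AdEx) neuron underlying these networks can be sketched in a few lines. The forward-Euler scheme and parameter values below are illustrative assumptions (roughly standard AdEx values in SI units), not the fitted parameters of this particular study:

```python
import math

def simulate_adex(I=0.5e-9, T=0.5, dt=1e-5):
    """Forward-Euler simulation of a single AdEx neuron; returns spike times.

    Illustrative parameters: capacitance C, leak gL/EL, exponential threshold
    VT with slope factor DT, subthreshold adaptation a, spike-triggered
    adaptation b with time constant tau_w, reset Vr, spike cutoff Vpeak.
    """
    C, gL, EL = 200e-12, 10e-9, -70e-3
    VT, DT = -50e-3, 2e-3
    a, b, tau_w = 2e-9, 60e-12, 200e-3
    Vr, Vpeak = -58e-3, 0.0
    V, w, spikes = EL, 0.0, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vpeak:            # spike: reset voltage, increment adaptation
            V, w = Vr, w + b
            spikes.append(step * dt)
    return spikes
```

Spike-frequency adaptation shows up as interspike intervals that lengthen over the simulation; a mean-field derivation then works with the population transfer function of such neurons rather than their individual trajectories.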
We report a transition from asynchronous to oscillatory behaviour in balanced inhibitory networks for class I and II neurons with instantaneous synapses. Collective oscillations emerge for sufficiently connected networks. Their origin is understood in terms of a recently developed mean-field model, whose stable solution is a focus. Microscopic irregular firings, due to balance, trigger sustained oscillations by exciting the relaxation dynamics towards the macroscopic focus. In balanced excitatory-inhibitory networks, the same mechanism induces quasi-periodic collective oscillations.
How does the brain link visual stimuli across space and time? Visual illusions provide an experimental paradigm to study these processes. When two stationary dots are flashed in close spatial and temporal succession, human observers experience a percept of apparent motion. Large spatiotemporal separation challenges the visual system to keep track of object identity along the apparent motion path, the so-called "correspondence problem." Here, we use voltage-sensitive dye imaging in primary visual cortex (V1) of awake monkeys to show that intracortical connections within V1 can solve this issue by shaping cortical dynamics to represent the illusory motion. We find that the appearance of the second stimulus in V1 creates a systematic suppressive wave traveling toward the retinotopic representation of the first. Using a computational model, we show that the suppressive wave is the emergent property of a recurrent gain control fed by the intracortical network. This suppressive wave acts to explain away ambiguous correspondence problems and contributes to precisely encode the expected motion velocity at the surface of V1. Together, these results demonstrate that the nonlinear dynamics within retinotopic maps can shape cortical representations of illusory motion. Understanding these dynamics will shed light on how the brain links sensory stimuli across space and time, by preformatting population responses for a straightforward read-out by downstream areas.
The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are crucial for brain functioning. They strongly depend on the topology of the network and on the fluctuations of the connectivity. We propose a heterogeneous mean-field approach to neural dynamics on random networks that explicitly preserves the disorder in the topology as the network size grows and leads to a set of self-consistent equations. Within this approach, we provide an effective description of microscopic and large-scale temporal signals in a leaky integrate-and-fire model with short-term plasticity, where quasi-synchronous events arise. Our equations provide a clear analytical picture of the dynamics, evidencing the contributions of both periodic (locked) and aperiodic (unlocked) neurons to the measurable average signal. In particular, we formulate and solve a global inverse problem of reconstructing the in-degree distribution from the knowledge of the average activity field. Our method is very general and applies to a large class of dynamical models on dense random networks.
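The short-term plasticity invoked above is commonly modeled with Tsodyks-Markram synaptic dynamics. The sketch below implements a depression-only version as an illustration; the study's actual synapse model may include facilitation and use different parameters:

```python
import math

def stp_depression(spike_times, U=0.5, tau_d=0.8):
    """Event-based Tsodyks-Markram depression: returns the synaptic efficacy
    U*x released at each presynaptic spike, with the resource fraction x
    recovering exponentially (time constant tau_d) between spikes."""
    x, t_last, eff = 1.0, None, []
    for t in spike_times:
        if t_last is not None:
            x = 1.0 - (1.0 - x) * math.exp(-(t - t_last) / tau_d)
        eff.append(U * x)
        x -= U * x               # each spike consumes a fraction U of resources
        t_last = t
    return eff
```

Under sustained presynaptic firing the efficacy relaxes to a rate-dependent fixed point; in recurrent networks this activity-dependent weakening is one mechanism that can terminate quasi-synchronous population events.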
Sleep slow waves are known to participate in memory consolidation, yet slow waves occurring under anesthesia have no positive effect on memory. Here, we shed light on this paradox, based on a combination of extracellular recordings in vivo, in vitro experiments, and computational models. By analyzing the temporal patterns of successive slow-wave events, we find two types of slow waves. The first type is consistently observed in natural slow-wave sleep, while the second is shown to be ubiquitous under anesthesia. Network models of spiking neurons predict that the two slow-wave types emerge due to a different gain on inhibitory versus excitatory cells and that different levels of spike-frequency adaptation in excitatory cells can account for dynamical distinctions between the two types. This prediction was tested in vitro by varying adaptation strength using an agonist of acetylcholine receptors, which demonstrated a neuromodulatory switch between the two types of slow waves. Finally, we show that the first type of slow-wave dynamics is more sensitive to external stimuli, which can explain how slow waves in sleep and anesthesia differentially affect memory consolidation, as well as provide a link between slow-wave dynamics and memory disorders.
Oscillations are a hallmark of neural population activity in various brain regions, with a spectrum covering a wide range of frequencies. Within this spectrum, γ oscillations have received particular attention due to their ubiquitous nature and their correlation with higher brain functions. Recently, it has been reported that γ oscillations in the hippocampus of behaving rodents are segregated in two distinct frequency bands: slow and fast. These two γ rhythms correspond to different states of the network, but their origin has not yet been clarified. Here we show theoretically and numerically that a single inhibitory population can give rise to coexisting slow and fast γ rhythms corresponding to collective oscillations of a balanced spiking network. The slow and fast γ rhythms are generated via two different mechanisms: the fast one driven by coordinated tonic neural firing and the slow one by endogenous fluctuations due to irregular neural activity. We show that almost instantaneous stimulations can switch the collective γ oscillations from slow to fast and vice versa. Furthermore, to draw a connection with the experimental observations, we consider the modulation of the γ rhythms induced by a slower (θ) rhythm driving the network dynamics. In this context, depending on the strength of the forcing and the noise amplitude, we observe phase-amplitude and phase-phase coupling between the fast and slow γ oscillations and the θ forcing. Phase-phase coupling reveals on average different θ-phase preferences for the two coexisting γ rhythms, together with a wide cycle-to-cycle variability.
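One standard way to quantify the θ-γ phase-amplitude coupling described above is a Canolty-style modulation index, |⟨A_γ e^{iφ_θ}⟩|. The signals below are synthetic and purely illustrative; in real recordings one would extract phase and amplitude via Hilbert transforms of band-passed data:

```python
import numpy as np

def modulation_index(theta_phase, gamma_amp):
    """Magnitude of the amplitude-weighted mean phase vector: near 0 when the
    gamma envelope is independent of theta phase, larger when it is locked."""
    return np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))

t = np.arange(0.0, 10.0, 1e-3)                 # 10 s at 1 kHz
phi_theta = 2 * np.pi * 8.0 * t                # 8 Hz theta phase
coupled = 1.0 + 0.5 * np.cos(phi_theta)        # gamma envelope locked to theta
uncoupled = np.ones_like(t)                    # constant gamma envelope
```

For the coupled envelope the index is 0.25 (half the modulation depth), while for the constant envelope it vanishes over whole θ cycles.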
Biological neural networks produce information backgrounds of multi-scale spontaneous activity that become more complex in brain states displaying higher capacities for cognition, for instance, attentive awake versus asleep or anesthetized states. Here, we review brain state-dependent mechanisms spanning ion channel currents (microscale) to the dynamics of brain-wide, distributed, transient functional assemblies (macroscale). Not unlike how microscopic interactions between molecules underlie structures formed in macroscopic states of matter, using statistical physics, the dynamics of microscopic neural phenomena can be linked to macroscopic brain dynamics through mesoscopic scales. Beyond spontaneous dynamics, it is observed that stimuli evoke collapses of complexity, most remarkably over the high-dimensional, asynchronous, irregular background dynamics during consciousness. In contrast, complexity may not be further collapsed beyond the synchrony and regularity characteristic of unconscious spontaneous activity. We propose that the increased dimensionality of spontaneous dynamics during conscious states supports responsiveness, enhancing neural networks' emergent capacity to robustly encode information over multiple scales.
We investigate the occurrence of quasisynchronous events in a random network of excitatory leaky integrate-and-fire neurons equipped with short-term plasticity. The dynamics is analyzed by monitoring both the evolution of global synaptic variables and, at the microscopic level, the interspike intervals of the individual neurons. We find that quasisynchronous events are the result of a mixture of synchronized and unsynchronized motion, analogous to the emergence of synchronization in the Kuramoto model. In the present context, disorder is due to the random structure of the network and thereby vanishes for a diverging network size N (i.e., in the thermodynamic limit), when statistical fluctuations become negligible. Remarkably, the fraction of asynchronous neurons remains strictly larger than zero for arbitrarily large N. This is due to the presence of a robust homoclinic cycle in the self-generated synchronous dynamics. The nontrivial large-N behavior is confirmed by the anomalous scaling of the maximum Lyapunov exponent, which is strictly positive in a finite network and decreases as N^(-0.27). Finally, we have checked the robustness of this dynamical phase with respect to the addition of noise, applied to either the reset potential or the leak current.
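The Kuramoto analogy drawn above can be made concrete with the usual order parameter r = |⟨e^{iθ_j}⟩|, which distinguishes the locked and unlocked subpopulations. The phase lists below are toy examples; assigning each neuron a phase (e.g. by interpolating between its successive spike times) is an assumption of this sketch:

```python
import cmath, math

def order_parameter(phases):
    """Kuramoto order parameter: 1 for full synchrony, ~0 for uniform phases."""
    z = sum(cmath.exp(1j * th) for th in phases) / len(phases)
    return abs(z)

locked = [0.3] * 50                                  # synchronized cluster
spread = [2 * math.pi * k / 50 for k in range(50)]   # asynchronous neurons
partial = locked + spread                            # mixture, as in the text
```

For the half-and-half mixture, r ≈ 0.5: exactly the partially synchronized regime in which a finite fraction of neurons remains asynchronous for arbitrarily large N.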