It is assumed that synaptic strengthening and weakening balance throughout learning to avoid runaway potentiation and memory interference. However, energetic and informational considerations suggest that potentiation should occur primarily during wake, when animals learn, and depression should occur during sleep. We measured 6,920 synapses in mouse motor and sensory cortices using three-dimensional electron microscopy. The axon-spine interface (ASI) decreased ~18% after sleep compared with wake. This decrease was proportional to ASI size, which is indicative of scaling. Scaling was selective, sparing synapses that were large and lacked recycling endosomes. Similar scaling occurred for spine head volume, suggesting a distinction between weaker, more plastic synapses (~80%) and stronger, more stable synapses. These results support the hypothesis that a core function of sleep is to renormalize overall synaptic strength increased by wake.
Integrated information theory provides a mathematical framework to fully characterize the cause-effect structure of a physical system. Here, we introduce PyPhi, a Python software package that implements this framework for causal analysis and unfolds the full cause-effect structure of discrete dynamical systems of binary elements. The software allows users to easily study these structures, serves as an up-to-date reference implementation of the formalisms of integrated information theory, and has been applied in research on complexity, emergence, and certain biological questions. We first provide an overview of the main algorithm and demonstrate PyPhi’s functionality in the course of analyzing an example system, and then describe details of the algorithm’s design and implementation. PyPhi can be installed with Python’s package manager via the command ‘pip install pyphi’ on Linux and macOS systems equipped with Python 3.4 or higher. PyPhi is open-source and licensed under the GPLv3; the source code is hosted on GitHub at https://github.com/wmayner/pyphi. Comprehensive and continually updated documentation is available at https://pyphi.readthedocs.io. The mailing list can be joined at https://groups.google.com/forum/#!forum/pyphi-users. A web-based graphical interface to the software is available at http://integratedinformationtheory.org/calculate.html.
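PyPhi's analysis starts from a transition probability matrix (TPM) describing how each binary element updates. As a minimal sketch of the kind of input the package expects (the gates and node names here are illustrative, not the package's documented example), the following builds a deterministic state-by-node TPM for a hypothetical three-node logic network:

```python
from itertools import product

# Hypothetical update rules for three binary nodes A, B, C.
# Each node's next state is a function of the current global state.
def next_state(state):
    a, b, c = state
    return (
        int(b or c),   # A <- OR(B, C)
        int(a and c),  # B <- AND(A, C)
        a ^ b,         # C <- XOR(A, B)
    )

# State-by-node TPM: one row per current network state, one column per
# node, entries = probability that the node is ON at the next step.
# The dynamics here are deterministic, so every entry is 0 or 1.
# States are ordered so that the first node varies fastest.
states = [s[::-1] for s in product((0, 1), repeat=3)]
tpm = [list(next_state(s)) for s in states]

for s, row in zip(states, tpm):
    print(s, "->", row)
```

Such a matrix is the kind of object one would hand to PyPhi's network constructor before computing cause-effect structures; consult the documentation at https://pyphi.readthedocs.io for the exact conventions (including state ordering) expected by the released package.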
Actual causation is concerned with the question "what caused what?" Consider a transition between two states within a system of interacting elements, such as an artificial neural network, or a biological brain circuit. Which combination of synapses caused the neuron to fire? Which image features caused the classifier to misinterpret the picture? Even detailed knowledge of the system's causal network, its elements, their states, connectivity, and dynamics does not automatically provide a straightforward answer to the "what caused what?" question. Counterfactual accounts of actual causation based on graphical models, paired with system interventions, have demonstrated initial success in addressing specific problem cases in line with intuitive causal judgments. Here, we start from a set of basic requirements for causation (realization, composition, information, integration, and exclusion) and develop a rigorous, quantitative account of actual causation that is generally applicable to discrete dynamical systems. We present a formal framework to evaluate these causal requirements that is based on system interventions and partitions, and considers all counterfactuals of a state transition. This framework is used to provide a complete causal account of the transition by identifying and quantifying the strength of all actual causes and effects linking the two consecutive system states. Finally, we examine several exemplary cases and paradoxes of causation and show that they can be illuminated by the proposed framework for quantifying actual causation.

MSC 2010 subject classifications: Primary 62-09; secondary 60J10.
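The counterfactual flavor of such accounts can be illustrated, in a deliberately naive form far simpler than the partition-based framework described here, by a "but-for" test: intervene on each input of a deterministic mechanism and check whether the output changes. The sketch below also shows why a naive test fails on overdetermined transitions, one of the classic problem cases:

```python
def but_for_causes(mechanism, inputs):
    """Naive counterfactual test: input i is a 'but-for' cause of the
    output if flipping i alone (holding the others fixed) changes the
    output. This misses overdetermined causes, which is exactly the
    kind of case a partition-based account is designed to handle."""
    actual = mechanism(inputs)
    causes = []
    for i in range(len(inputs)):
        flipped = list(inputs)
        flipped[i] ^= 1  # intervene: flip only input i
        if mechanism(tuple(flipped)) != actual:
            causes.append(i)
    return causes

# An OR gate with both inputs ON is overdetermined: flipping either
# input alone leaves the output unchanged, so the naive test finds
# no single but-for cause at all.
OR = lambda s: int(any(s))
print(but_for_causes(OR, (1, 1)))  # overdetermined: no but-for cause
print(but_for_causes(OR, (1, 0)))  # input 0 is a but-for cause
```

The gate and helper names are illustrative; the paper's account replaces this all-or-nothing test with graded causal strengths computed over interventions and partitions.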
During non-rapid eye-movement (NREM) sleep, cortical and thalamic neurons oscillate every second or so between ON periods, characterized by membrane depolarization and wake-like tonic firing, and OFF periods, characterized by membrane hyperpolarization and neuronal silence. Cortical slow waves, the hallmark of NREM sleep, reflect near-synchronous OFF periods in cortical neurons. However, the mechanisms triggering such OFF periods are unclear, as there is little evidence for somatic inhibition. We studied cortical inhibitory interneurons that express somatostatin (SOM), because ∼70% of them are Martinotti cells that diffusely target layer I and can block excitatory transmission presynaptically, at glutamatergic terminals, and postsynaptically, at apical dendrites, without inhibiting the soma. In freely moving male mice, we show that SOM+ cells can fire immediately before slow waves and that their optogenetic stimulation during ON periods of NREM sleep triggers long OFF periods. Next, we show that chemogenetic activation of SOM+ cells increases slow-wave activity (SWA), the slope of individual slow waves, and NREM sleep duration, whereas their chemogenetic inhibition decreases SWA and slow-wave incidence without changing time spent in NREM sleep. By contrast, activation of parvalbumin+ (PV+) cells, the most numerous population of cortical inhibitory neurons, greatly decreases SWA and cortical firing, triggers short OFF periods in NREM sleep, and increases NREM sleep duration. Thus, SOM+ cells, but not PV+ cells, are involved in the generation of sleep slow waves. Whether Martinotti cells are solely responsible for this effect, or are complemented by other classes of inhibitory neurons, remains to be investigated. Cortical slow waves are a defining feature of non-rapid eye-movement (NREM) sleep and are thought to be important for many of its restorative benefits.
Yet, the mechanism by which cortical neurons abruptly and synchronously cease firing, the neuronal basis of the slow wave, remains unknown. Using chemogenetic and optogenetic approaches, we provide the first evidence linking a specific class of inhibitory interneurons, somatostatin-positive cells, to the generation of slow waves during NREM sleep in freely moving mice.
Standard techniques for studying biological systems largely focus on their dynamical or, more recently, their informational properties, usually taking either a reductionist or holistic perspective. Yet, studying only individual system elements or the dynamics of the system as a whole disregards the organisational structure of the system: whether there are subsets of elements with joint causes or effects, and whether the system is strongly integrated or composed of several loosely interacting components. Integrated information theory (IIT) offers a theoretical framework to (1) investigate the compositional cause-effect structure of a system, and to (2) identify causal borders of highly integrated elements comprising local maxima of intrinsic cause-effect power. Here we apply this comprehensive causal analysis to a Boolean network model of the fission yeast (Schizosaccharomyces pombe) cell cycle. We demonstrate that this biological model features a non-trivial causal architecture, whose discovery may provide insights about the real cell cycle that could not be gained from holistic or reductionist approaches. We also show how some specific properties of this underlying causal architecture relate to the biological notion of autonomy. Ultimately, we suggest that analysing the causal organisation of a system, including key features like intrinsic control and stable causal borders, should prove relevant for distinguishing life from non-life, and thus could also illuminate the origin-of-life problem.
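In a Boolean network model such as the fission yeast cell-cycle model, every node is updated synchronously from a logical rule, and the dynamics eventually settle into an attractor (a fixed point or a cycle). A generic sketch of this machinery, using a hypothetical two-node mutual-negation network rather than the yeast model itself:

```python
def find_attractor(update, state, max_steps=1000):
    """Iterate a synchronous Boolean network update from `state` until
    a previously seen state recurs; return the repeating cycle of
    states (the attractor). A fixed point returns as a 1-cycle."""
    seen = {}
    trajectory = []
    for step in range(max_steps):
        if state in seen:
            return trajectory[seen[state]:]  # the repeating cycle
        seen[state] = step
        trajectory.append(state)
        state = update(state)
    raise RuntimeError("no attractor found within max_steps")

# Hypothetical two-node network: each node copies the negation of the
# other. Symmetric states fall into a 2-cycle; asymmetric states are
# fixed points.
flip = lambda s: (1 - s[1], 1 - s[0])
print(find_attractor(flip, (0, 0)))  # 2-cycle: (0,0) <-> (1,1)
print(find_attractor(flip, (1, 0)))  # fixed point: (1,0)
```

The published yeast model has ten nodes and threshold-style rules; the point here is only the synchronous update and attractor search that such analyses share.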
Sleep has been hypothesized to rebalance overall synaptic strength after ongoing learning during waking leads to net synaptic potentiation. If so, because synaptic strength and size are correlated, synapses on average should be larger after wake and smaller after sleep. This prediction was recently confirmed in mouse cerebral cortex using serial block-face electron microscopy (SBEM). However, whether these findings extend to other brain regions is unknown. Moreover, sleep deprivation by gentle handling was reported to produce hippocampal spine loss, raising the question of whether synapse size and number are differentially affected by sleep and waking. Here we applied SBEM to measure axon-spine interface (ASI), the contact area between pre-synapse and post-synapse, and synapse density in CA1 stratum radiatum. Adolescent YFP-H mice were studied after 6-8 h of sleep (S = 6), spontaneous wake at night (W = 4) or wake enforced during the day by novelty exposure (EW = 4; males/females balanced). In each animal ≥425 ASIs were measured and synaptic vesicles were counted in ∼100 synapses/mouse. Reconstructed dendrites included many small, nonperforated synapses and fewer large, perforated synapses. Relative to S, ASI sizes in perforated synapses shifted toward higher values after W and more so after EW. ASI sizes in nonperforated synapses grew after EW relative to S and W, and so did their density. ASI size correlated with presynaptic vesicle number but the proportion of readily available vesicles decreased after EW, suggesting presynaptic fatigue. Thus, CA1 synapses undergo changes consistent with sleep-dependent synaptic renormalization and their number increases after extended wake.
The Integrated Information Theory (IIT) of consciousness starts from essential phenomenological properties, which are then translated into postulates that any physical system must satisfy in order to specify the physical substrate of consciousness. We recently introduced an information measure (Barbosa et al., 2020) that captures three postulates of IIT—existence, intrinsicality, and information—and is unique. Here we show that the new measure also satisfies the remaining postulates of IIT—integration and exclusion—and we develop the framework that identifies maximally irreducible mechanisms. These mechanisms can then form maximally irreducible systems, which in turn will specify the physical substrate of conscious experience.
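To convey the flavor of such a measure, here is a minimal sketch, assuming (the exact form should be checked against Barbosa et al., 2020) that the intrinsic difference between a constrained distribution p and an unconstrained distribution q is the maximum over states of p(s)·log2(p(s)/q(s)), so that a mechanism scores highly only when it both selectively and strongly constrains its cause or effect:

```python
from math import log2

def intrinsic_difference(p, q):
    """Illustrative intrinsic-difference-style measure between two
    distributions over the same states: max over states of
    p(s) * log2(p(s) / q(s)). The precise definition used by
    Barbosa et al. (2020) should be taken from the paper itself;
    this form is an assumption for illustration."""
    return max(ps * log2(ps / qs) for ps, qs in zip(p, q) if ps > 0)

# A mechanism that pins down one of four states completely, compared
# against the uniform (unconstrained) distribution:
p = [1.0, 0.0, 0.0, 0.0]
q = [0.25, 0.25, 0.25, 0.25]
print(intrinsic_difference(p, q))  # 1.0 * log2(4) = 2.0
```

A maximum (rather than a sum) rewards a single highly informative state, which is one way a measure can reflect the intrinsicality requirement.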