Abstract: The analysis of complex systems frequently poses the challenge of distinguishing correlation from causation. Statistical physics has inspired very promising approaches to searching for correlations in time series, the transfer entropy in particular [1]. Methods from computational statistics can now quantitatively assign significance to such correlation measures. In this study, we propose and apply a procedure to statistically assess transfer entropies by one-sided tests. We introduce two null models of vanishing cor…
“…To perform a statistical assessment of the identified links, we create a null model that shuffles the neural events (n = 200) while maintaining the distribution of inter-event intervals (IEIs). For a given data set (X, Y), we compute the so-called Z-score as Z = (TE − µ(TE_s)) / σ(TE_s), where µ(TE_s) is the mean value of a sample s under the null hypothesis of independence and σ(TE_s) is the respective standard deviation.…”
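The shuffle-based null model and Z-score described in this excerpt can be sketched as follows. This is a minimal illustration, not the authors' code: the IEI-preserving shuffle and the Z-score follow the quoted procedure, while the TE estimator itself is left abstract (any estimator returning a scalar per pair of series can be plugged in).

```python
import numpy as np

rng = np.random.default_rng(0)

def iei_shuffle(event_times):
    """Build a surrogate event train by shuffling the inter-event
    intervals (IEIs): the IEI distribution is preserved exactly,
    but the original timing structure is destroyed."""
    ieis = np.diff(event_times)
    rng.shuffle(ieis)
    return np.concatenate(([event_times[0]],
                           event_times[0] + np.cumsum(ieis)))

def z_score(te_observed, te_surrogates):
    """Z-score of the observed TE against the surrogate sample s:
    Z = (TE - mu(TE_s)) / sigma(TE_s)."""
    mu = np.mean(te_surrogates)
    sigma = np.std(te_surrogates, ddof=1)
    return (te_observed - mu) / sigma
```

A one-sided test then rejects independence only when Z exceeds a chosen threshold, since only unexpectedly large TE values argue for a directed link.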
The need for in vitro models that mimic the human brain to replace animal testing and allow high-throughput screening has driven scientists to develop new tools that reproduce tissue-like features on a chip. Three-dimensional (3D) in vitro cultures are emerging as an unmatched platform that preserves the complexity of cell-to-cell connections within a tissue, improves cell survival, and boosts neuronal differentiation. In this context, new and flexible imaging approaches are required to monitor the functional states of 3D networks. Herein, we propose an experimental model based on 3D neuronal networks in an alginate hydrogel, a tunable wide-volume imaging approach, and an efficient denoising algorithm to resolve, down to single-cell resolution, the 3D activity of hundreds of neurons expressing the calcium sensor GCaMP6s. Furthermore, we implemented a 3D co-culture system mimicking the contiguous interfaces of distinct brain tissues, such as the cortical-hippocampal interface. The analysis of the network activity of single and layered neuronal co-cultures revealed cell-type-specific activities and an organization of neuronal subpopulations that changed between the two culture configurations. Overall, our experimental setup represents a simple, powerful, and cost-effective platform for developing and monitoring living 3D layered brain-tissue-on-chip structures with high resolution and high throughput.
“…The logistic map was used because of its simplicity and well-understood, parameter-dependent, non-divergent long-time behavior. It is commonly used in simulation frameworks as a model of choice (see [61, 62] for examples in the context of TE and [59, 60] for simulation of neuronal activity). Our model was composed of three main branches with four channels each and an additional channel to which every branch projects with a different interaction delay.…”
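A minimal version of such a logistic-map network might look like the following two-channel sketch. The map parameter `r`, coupling strength `eps`, and delay handling are illustrative assumptions, not the authors' exact model (which has three branches of four channels each plus a common target channel).

```python
import numpy as np

def coupled_logistic(n_steps, r=4.0, eps=0.2, delay=3, seed=1):
    """Driver x and driven y, both logistic maps; y receives the
    driver's state with a fixed interaction delay (in samples)."""
    rng = np.random.default_rng(seed)
    f = lambda v: r * v * (1.0 - v)  # the logistic map
    x = np.empty(n_steps)
    y = np.empty(n_steps)
    # random transient so the delayed driver history is defined
    x[:delay + 1] = rng.uniform(0.1, 0.9, delay + 1)
    y[:delay + 1] = rng.uniform(0.1, 0.9, delay + 1)
    for t in range(delay, n_steps - 1):
        x[t + 1] = f(x[t])
        # convex combination keeps y inside [0, 1] (non-divergent)
        y[t + 1] = (1 - eps) * f(y[t]) + eps * f(x[t + 1 - delay])
    return x, y
```

With r = 4 the map stays bounded in [0, 1] for initial values in that interval, which is the non-divergent long-time behavior the excerpt refers to; a TE estimator applied to (x, y) should then recover the x → y direction at the chosen delay.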
Transfer entropy (TE) provides a generalized and model-free framework to study Wiener-Granger causality between brain regions. Because of its nonparametric character, TE can infer directed information flow even from nonlinear systems. Despite its increasing number of applications in neuroscience, little is known about the influence of common electrophysiological preprocessing on its estimation. We test the influence of filtering and downsampling on a recently proposed nearest-neighbor-based TE estimator. Different filter settings and downsampling factors were tested in a simulation framework using a model with a linear coupling function and two nonlinear models with sigmoid and logistic coupling functions. For nonlinear coupling and progressively lower low-pass filter cut-off frequencies, up to 72% false negative direct connections and up to 26% false positive connections were identified. In contrast, for the linear model, a monotonic increase was only observed for missed indirect connections (up to 86%). High-pass filtering (1 Hz, 2 Hz) had no impact on TE estimation. After low-pass filtering, interaction delays were significantly underestimated. Downsampling the data by a factor greater than the assumed interaction delay erased most of the transmitted information and thus led to a very high percentage (67–100%) of false negative direct connections. Low-pass filtering increases the number of missed connections depending on the filter's cut-off frequency. Downsampling should only be done if the downsampling factor is smaller than the smallest assumed interaction delay of the analyzed network.
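The paper's practical recommendation, downsample only when the factor is smaller than the smallest assumed interaction delay, can be encoded as a simple guard; the function name and interface here are our own illustration, not part of the study.

```python
import numpy as np

def safe_downsample(signal, factor, min_delay_samples):
    """Downsample by plain decimation (every `factor`-th sample),
    refusing factors that would erase the lagged information TE
    relies on (factor >= smallest assumed interaction delay)."""
    if factor >= min_delay_samples:
        raise ValueError(
            f"factor {factor} >= smallest assumed interaction delay "
            f"of {min_delay_samples} samples: TE links would likely vanish")
    return np.asarray(signal)[::factor]
```

Note that proper decimation normally includes an anti-aliasing low-pass filter, which, per the results above, itself biases TE estimates toward missed connections and underestimated delays; plain sample selection is shown here only for brevity.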
“…direction of information flow, which can be considered as a measure of potential causation from X to Y (Boba et al 2015). In contrast with GC, TE is not framed in terms of prediction but in terms of resolution of uncertainty (Barnett et al 2009).…”
Section: Causality As Information Flow Through Transfer Entropy
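For reference, the standard definition of transfer entropy (Schreiber, 2000) that formalizes this "resolution of uncertainty" reading is, for history embedding dimensions k and l:

```latex
TE_{X \to Y} = \sum p\!\left(y_{t+1},\, y_t^{(k)},\, x_t^{(l)}\right)
  \log \frac{p\!\left(y_{t+1} \mid y_t^{(k)},\, x_t^{(l)}\right)}
            {p\!\left(y_{t+1} \mid y_t^{(k)}\right)}
```

Here y_t^{(k)} and x_t^{(l)} are the k- and l-dimensional past states of Y and X; TE measures how much uncertainty about y_{t+1} is resolved by X's past beyond what Y's own past already resolves, which is why, unlike Granger causality, it is not framed in terms of prediction.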
The sustainable development goals (SDGs) launched by the United Nations (UN) set a new direction for development covering the environmental, economic, and social pillars. Given the complex and interdependent nature of socioeconomic and environmental systems, however, understanding the cause-effect relationships between policy actions and their outcomes on SDGs remains a challenge. We provide a systematic review of the cause-effect analysis literature in the context of quantitative sustainability assessment. The cause-effect analysis literature in both the social and natural sciences has gained significant breadth and depth, and some pioneering applications have begun to address sustainability challenges. We focus on randomized experiments, natural experiments, observational studies, and time-series methods, and the applicability of these approaches to quantitative sustainability assessment with respect to the plausibility of their assumptions, their limitations, and their data requirements. Despite these promising developments, however, we find that quantifying the sustainability consequences of a policy action, and providing unequivocal policy recommendations based on it, is still a challenge. We recognize some of the key data requirements and assumptions necessary to design formal experiments as the bottleneck for conducting scientifically defensible cause-effect analysis in the context of quantitative sustainability assessment. Our study calls for a multi-disciplinary effort to develop an operational framework for quantifying the sustainability consequences of policy actions. In the meantime, continued efforts need to be made to advance other modeling platforms such as mechanistic models and simulation tools.
We highlighted the importance of understanding and properly communicating the uncertainties associated with such models, regular monitoring and feedback on the consequences of policy actions to the modelers and decision-makers, and the use of what-if scenarios in the absence of well-formulated cause-effect analysis.