With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. In most of these efforts, however, cell damage is characterized by purely mechanistic criteria: functions of quantities such as stress, strain, or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has rarely been explored. In particular, a quantitative, mechanically based model of electrophysiological impairment in neuronal cells, Neurite, has only very recently been proposed. In this paper, we present the implementation details of this model: a finite-difference parallel program for simulating electrical signal propagation along neurites under mechanical loading. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite simulates the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. Simulating the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations whose complexity grows with the size of the simulated cell network. The solvers implemented in Neurite (explicit and implicit) were therefore parallelized on graphics processing units to reduce the cost of large-scale simulations. Cable theory and Hodgkin-Huxley models account for the electrophysiologically passive and active regions of a neurite, respectively, while a coupled mechanical model of the neurite's behavior within its surrounding medium links electrophysiology to mechanics.
This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted.
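The passive regions mentioned in the abstract are governed by the cable equation, which a finite-difference solver such as Neurite's discretizes in space and time. The following is a minimal sketch of one explicit update step for a passive cable; the parameter values and function name are illustrative assumptions, not Neurite's actual scheme.

```python
# Explicit finite-difference step for the passive cable equation,
# tau * dV/dt = lambda^2 * d2V/dx2 - V. All values are illustrative,
# not taken from the Neurite program itself.

def cable_step(v, dx, dt, lam=0.1, tau=1.0):
    """Advance the membrane potential profile v (a list, mV) one time step."""
    new = v[:]
    for i in range(1, len(v) - 1):          # boundary nodes held fixed
        d2v = (v[i + 1] - 2.0 * v[i] + v[i - 1]) / dx**2
        new[i] = v[i] + (dt / tau) * (lam**2 * d2v - v[i])
    return new

# Inject a voltage perturbation near the left end and let it spread and decay.
n, dx, dt = 50, 0.01, 1e-4                  # dt chosen to satisfy stability
v = [0.0] * n
v[1] = 10.0                                 # 10 mV initial perturbation
for _ in range(200):
    v = cable_step(v, dx, dt)
```

The explicit scheme is only conditionally stable: the step sizes must satisfy `dt * lam**2 / (tau * dx**2) <= 1/2`, which is one motivation for the implicit solver mentioned above.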
Most studies on solar forecasting do not analyze and exploit the temporal and spatial components inherent to the task. Furthermore, they mostly focus on precision alone rather than on other meaningful features, such as flexibility and robustness. With current energy production trends, where many solar panels are distributed across city rooftops, there is a need to manage all of this information simultaneously and to be able to add and remove sensors as needed. Likewise, robust models need to cope with (inevitable) sensor failures and continue producing reliable predictions. For these reasons, solar forecasting models need to be as decoupled as possible from the number of data sources that feed them and from their geographical distribution, which also makes the models reusable. This article contributes a family of Deep Learning models for solar irradiance forecasting that comply with the aforementioned features, i.e. flexibility and robustness. In a first stage, several Artificial Neural Networks are trained as a basis for predicting solar irradiance at several locations at the same time. Thereupon, a family of models that work with irradiance maps, thanks to Convolutional Long Short-Term Memory layers, is presented, obtaining forecast skills between 7.4% and 41% (depending on the location and horizon) relative to the baseline. The latter family offers the flexibility and robustness required in large-scale Intelligent Environments, such as Smart Cities. Working with irradiance maps means that new sensors can be added (or removed) as needed, without rebuilding the model. Experiments show that sensor failures have only a mild impact on the prediction error across several forecast horizons.
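The 7.4-41% figures quoted above are forecast skills, conventionally defined as the relative RMSE improvement over a reference (often persistence) forecast. A minimal sketch of the metric, with made-up irradiance values for illustration:

```python
# Forecast skill: skill = 1 - RMSE(model) / RMSE(reference baseline).
# A persistence baseline simply repeats the last observed value.
# The sample data below are invented for illustration only.
import math

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def forecast_skill(model_pred, baseline_pred, obs):
    """Skill > 0 means the model beats the baseline; 1 would be a perfect forecast."""
    return 1.0 - rmse(model_pred, obs) / rmse(baseline_pred, obs)

obs         = [520.0, 540.0, 610.0, 580.0]   # W/m^2, hypothetical observations
persistence = [500.0, 520.0, 540.0, 610.0]   # "last observed value" forecast
model       = [515.0, 545.0, 600.0, 585.0]   # hypothetical model output

skill = forecast_skill(model, persistence, obs)
```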
This paper presents a flexible and scalable approach to parallelizing the computation of optical flow, based on data-parallel distribution. Images are divided into several subimages that are processed by a software pipeline while respecting the dependencies between computation stages. The parallelization has been implemented on three different infrastructures (shared memory, distributed memory, and hybrid) to demonstrate its conceptual flexibility and scalability, and a significant performance improvement was obtained in all three cases. These versions have been used to compute the optical flow of video sequences taken under adverse conditions, with a moving camera and natural-light conditions, on board a conventional vehicle traveling on public roads. The parallelization was developed from an analysis of the dependencies exhibited by the well-known Lucas-Kanade algorithm, using a sequential version developed at the University of Porto as the starting point.
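At the core of the Lucas-Kanade algorithm mentioned above is a per-window least-squares solve of the optical-flow constraint Ix*u + Iy*v = -It. The toy sketch below solves the 2x2 normal equations for a single window of a synthetic image pair encoding a one-pixel horizontal shift; function and variable names are illustrative, not from the paper's implementation.

```python
# Lucas-Kanade least-squares step for one window:
#   [sum Ix^2   sum IxIy] [u]   [-sum IxIt]
#   [sum IxIy   sum Iy^2] [v] = [-sum IyIt]
# Synthetic frames with a pure 1-pixel horizontal shift; illustrative only.

def lucas_kanade_window(frame1, frame2):
    h, w = len(frame1), len(frame1[0])
    sxx = sxy = syy = sxt = syt = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (frame1[y][x + 1] - frame1[y][x - 1]) / 2.0  # central differences
            iy = (frame1[y + 1][x] - frame1[y - 1][x]) / 2.0
            it = frame2[y][x] - frame1[y][x]                  # temporal difference
            sxx += ix * ix
            sxy += ix * iy
            syy += iy * iy
            sxt += ix * it
            syt += iy * it
    det = sxx * syy - sxy * sxy      # A^T A must be invertible (no aperture problem)
    u = (-syy * sxt + sxy * syt) / det
    v = (sxy * sxt - sxx * syt) / det
    return u, v

# f(x, y) = x*y, shifted right by one pixel between the two frames.
size = 8
f1 = [[x * y for x in range(size)] for y in range(size)]
f2 = [[(x - 1) * y for x in range(size)] for y in range(size)]
u, v = lucas_kanade_window(f1, f2)
```

The data-parallel decomposition described in the abstract applies naturally here: each subimage's windows can be solved independently once the gradient stage for its halo region is complete.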
Measures of functional connectivity are commonly employed in neuroimaging research. Among the most popular is the Synchronization Likelihood, which provides a non-linear estimate of the statistical dependencies between the activity time courses of different brain areas. One aspect that has limited wider use of this algorithm is that it is very demanding in both computation and memory. In the present work we propose new implementations and parallelizations of the Synchronization Likelihood algorithm with significantly better performance in both time and memory use: computation time is reduced by three orders of magnitude and the memory needed for the calculations by two orders of magnitude. This makes analyses feasible that were previously out of computational reach.
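The Synchronization Likelihood rests on time-delay embedding: it measures how often channel B's state recurs at the times when channel A's state recurs. The sketch below is a deliberately simplified, quadratic-cost version of that idea (it omits the Theiler window and per-time normalization of the full algorithm, and all parameters are illustrative), which also makes plain why the naive computation is so expensive: every pair of time points must be compared.

```python
# Simplified Synchronization Likelihood sketch: after time-delay embedding,
# count how often the two channels' recurrences coincide. The critical
# distances are fixed so that a fraction p_ref of pairs count as "close".
# This is a toy illustration, not the full published algorithm.
import math

def embed(x, dim=3, lag=1):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    return [x[i:i + dim * lag:lag] for i in range(len(x) - (dim - 1) * lag)]

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def sync_likelihood(x, y, dim=3, lag=1, p_ref=0.1):
    ex, ey = embed(x, dim, lag), embed(y, dim, lag)
    n = len(ex)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]  # O(n^2) cost
    dx = sorted(dist(ex[i], ex[j]) for i, j in pairs)
    dy = sorted(dist(ey[i], ey[j]) for i, j in pairs)
    ex_eps = dx[int(p_ref * len(dx))]        # distance below which x "recurs"
    ey_eps = dy[int(p_ref * len(dy))]
    hits = total = 0
    for i, j in pairs:
        if dist(ex[i], ex[j]) <= ex_eps:
            total += 1
            if dist(ey[i], ey[j]) <= ey_eps:
                hits += 1
    # ~1 for independent channels, up to 1/p_ref for identical ones
    return hits / total / p_ref if total else 0.0

x = [math.sin(0.3 * i) for i in range(60)]
y = [math.sin(0.3 * i + 0.7) + 0.5 * math.sin(1.1 * i) for i in range(60)]
sl_same = sync_likelihood(x, x)              # identical channels: maximal value
sl_diff = sync_likelihood(x, y)
```

The pairwise-distance lists are where the naive memory cost explodes for long recordings, which is exactly the bottleneck the optimized implementations described above target.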