Simulating the brain-body-environment trinity in closed loop is an attractive proposal for investigating how perception, motor activity and interactions with the environment shape brain activity, and vice versa. The relevance of this embodied approach, however, hinges entirely on the modeled complexity of the various simulated phenomena. In this article, we introduce a software framework that is capable of simulating large-scale, biologically realistic networks of spiking neurons embodied in a biomechanically accurate musculoskeletal system that interacts with a physically realistic virtual environment. We deploy this framework on the high-performance computing resources of the EBRAINS research infrastructure, and we investigate its scaling performance by distributing computation across an increasing number of interconnected compute nodes. Our architecture is based on requested compute nodes as well as persistent virtual machines; this provides a high-performance simulation environment that is accessible to multi-domain users without expert knowledge, with a view to enabling users to instantiate and control simulations at custom scale via a web-based graphical user interface. Our simulation environment, entirely open source, is based on the Neurorobotics Platform developed in the context of the Human Brain Project and on the NEST simulator. We characterize the capabilities of our parallelized architecture for large-scale embodied brain simulations through two benchmark experiments, investigating how scaling compute resources affects performance defined in terms of experiment runtime, brain instantiation time and simulation time. The first benchmark is based on a large-scale balanced network, while the second is a multi-region embodied brain simulation consisting of more than a million neurons and a billion synapses. Both benchmarks clearly show that scaling compute resources improves the aforementioned performance metrics in a near-linear fashion.
The second benchmark in particular is indicative of both the potential and the limitations of a highly distributed simulation, in terms of a trade-off between computation speed and resource cost. Our simulation architecture is being prepared for release as an EBRAINS service accessible to everyone, thereby offering a community-wide tool with a unique workflow that should provide momentum to the investigation of closed-loop embodiment within the computational neuroscience community.
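The balanced-network benchmark follows a standard pattern in spiking-network simulation: sparsely connected excitatory and inhibitory populations whose inputs roughly cancel. As an illustration only, here is a minimal pure-Python/NumPy sketch of such a network; it is not the NEST-based implementation used in the benchmarks, and the network size, weights, and external drive are fabricated for readability:

```python
import numpy as np

def simulate_balanced_network(n_exc=800, n_inh=200, steps=1000, dt=0.1,
                              p_conn=0.1, g=5.0, seed=42):
    """Euler-integrate a current-based LIF network in which inhibitory
    weights are scaled by -g to roughly balance excitation."""
    rng = np.random.default_rng(seed)
    n = n_exc + n_inh
    # Sparse random connectivity; inhibitory columns carry negative weight.
    w = (rng.random((n, n)) < p_conn).astype(float) * 0.1
    w[:, n_exc:] *= -g
    v = np.zeros(n)                # membrane potentials
    tau, v_th, v_reset = 10.0, 1.0, 0.0
    i_ext = 1.2                    # constant suprathreshold external drive
    spikes = np.zeros(n, dtype=bool)
    spike_count = 0
    for _ in range(steps):
        i_syn = w @ spikes         # synaptic input from last step's spikes
        v += dt / tau * (-v + i_ext + i_syn)
        spikes = v >= v_th
        v[spikes] = v_reset        # reset neurons that fired
        spike_count += int(spikes.sum())
    return spike_count

# Mean firing rate in spikes per neuron per second (steps * dt = 100 ms).
count = simulate_balanced_network()
rate = count / 1000 / 0.1
```

In NEST itself, the same structure would be expressed declaratively with `nest.Create` and `nest.Connect` and then distributed across compute nodes via MPI, which is the setting the scaling benchmarks measure.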
The more we investigate the principles of motion learning in biological systems, the more we reveal the central role that body morphology plays in motion execution. Not only does anatomy define the kinematics and therefore the complexity of possible movements, but it has also become clear that part of the computation required for motion control is offloaded to body dynamics (a phenomenon referred to as “Morphological Computation”). Consequently, a proper design of body morphology is essential to carry out meaningful simulations of motor control in robotic and musculoskeletal systems. The design should not be fixed before a simulation experiment; rather, it is a central research aspect of every motion learning experiment and requires continuous adaptation during the experimental phase. We herein introduce a plugin for the 3D modeling suite Blender that enables researchers to design morphologies for simulation experiments in, particularly but not restricted to, the Neurorobotics Platform. The Robot Designer includes design capabilities for both musculoskeletal bodies and robotic systems. We thereby hope not only to foster the understanding of biological motion and to enable better robot designs, but also to enable true neurorobotic experiments that may consist of biomimetic models, such as tendon-driven robots, as a mix of, or a transition between, biology and technology. This plugin helps researchers design and parameterize models with a graphical user interface and thus simplifies and speeds up the overall design process.
To control highly dynamic compliant motions such as running or hopping, vertebrates rely on reflexes and Central Pattern Generators (CPGs) as core strategies. However, decoding how much each strategy contributes to control, and how the strategies are adjusted under different conditions, is still a major challenge. To help answer this question, the present paper provides a comprehensive comparison of reflexes, CPGs and a commonly used combination of the two, applied to a biomimetic robot. It leverages recent findings indicating that in mammals both control principles act within a low-dimensional control submanifold. This substantially reduces the parameter search space and enables a quantifiable comparison of the different control strategies. The chosen metrics are motion stability and energy efficiency, both key aspects for the evolution of the central nervous system. We find that applying the state-of-the-art approach of a continuously feedback-adapted CPG is favorable neither for stability nor for energy efficiency. In both respects, a pure reflex is more effective, but the pure CPG allows easy signal alteration when needed. Additionally, the hardware experiments clearly show that the shape of a control signal has a strong influence on energy efficiency, whereas previous research usually focused only on frequency alignment. Both findings suggest that currently used methods for combining the advantages of reflexes and CPGs can be improved. In future research, possible combinations of the control strategies should be reconsidered, specifically including modulation of the control signal's shape. For this endeavor, the presented setup provides a valuable benchmark framework enabling the quantitative comparison of different bioinspired control principles.
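To make the CPG side of this comparison concrete, the sketch below implements a classic Matsuoka half-center oscillator, one common way to model a CPG. This is an illustrative textbook model with made-up parameters, not the controller used on the robot in the paper:

```python
import numpy as np

def matsuoka_cpg(steps=5000, dt=0.001, tau=0.05, T=0.5, b=2.5, w=2.0, s=1.0):
    """Half-center Matsuoka oscillator: two mutually inhibiting neurons
    with self-adaptation produce an alternating rhythmic output."""
    u = np.array([0.1, 0.0])       # membrane states (asymmetric start)
    v = np.zeros(2)                # slower adaptation (fatigue) states
    out = np.empty(steps)
    for i in range(steps):
        y = np.maximum(u, 0.0)     # rectified firing rates
        u += dt * (-u - b * v - w * y[::-1] + s) / tau   # mutual inhibition
        v += dt * (-v + y) / T                           # adaptation dynamics
        out[i] = y[0] - y[1]       # antagonistic drive, e.g. flexor - extensor
    return out

signal = matsuoka_cpg()
```

The parameters satisfy the standard oscillation condition for this model (roughly 1 + tau/T < w < 1 + b), so the output alternates in sign. A reflex controller, by contrast, would compute its output directly from sensory feedback at each step, which is what makes the shape of the two signals, and not just their frequency, differ.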
Although we can measure muscle activity and analyze muscle activation patterns, we understand little about how individual muscles contribute to the joint torque generated. Muscles are known to be controlled by circuits in the spinal cord, a system much less well understood than the cortex. Knowing the contribution of individual muscles toward a joint torque would improve our understanding of human limb control. We present a novel framework for examining the control of biomechanics using physics simulations informed by electromyography (EMG) data. These signals drive a virtual musculoskeletal model in the Neurorobotics Platform (NRP), which we then use to evaluate the resulting joint torques. We use our framework to analyze raw EMG data collected during an isometric knee extension study to identify synergies that drive a musculoskeletal lower-limb model. The resulting knee torques are used as a reference for genetic algorithms (GAs) to generate new simulated activation patterns. On the platform, the GA finds solutions that generate torques matching those observed. Possible solutions include synergies that are similar to those extracted from the human study. In addition, the GA finds activation patterns that differ from the biological ones while still producing the same knee torque. The NRP forms a highly modular integrated simulation platform allowing such in silico experiments. We argue that our framework enables research on the neurobiomechanical control of muscles during tasks, which would otherwise not be possible.
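The GA step described above can be illustrated with a toy example: evolve activation vectors so that a fixed moment-arm matrix maps them onto a target joint torque. The matrix, target, and GA parameters below are fabricated for illustration and are not taken from the study, which evaluates torques through a full musculoskeletal simulation rather than a linear map:

```python
import numpy as np

def evolve_activations(moment_arms, target_torque, pop_size=60,
                       generations=200, sigma=0.05, seed=0):
    """Toy genetic algorithm: find muscle activations in [0, 1] whose
    torque (moment_arms @ a) matches a target torque vector."""
    rng = np.random.default_rng(seed)
    n_muscles = moment_arms.shape[1]
    pop = rng.random((pop_size, n_muscles))        # random initial population
    for _ in range(generations):
        errors = np.linalg.norm(pop @ moment_arms.T - target_torque, axis=1)
        elite = pop[np.argsort(errors)[: pop_size // 4]]   # keep best quarter
        # Each elite produces three mutated children; elites survive unchanged.
        mutants = np.repeat(elite, 3, axis=0)
        mutants = np.clip(mutants + rng.normal(0.0, sigma, mutants.shape),
                          0.0, 1.0)
        pop = np.vstack([elite, mutants])
    errors = np.linalg.norm(pop @ moment_arms.T - target_torque, axis=1)
    return pop[np.argmin(errors)]

# Two joint torques produced by four muscles (fabricated moment arms).
M = np.array([[1.0, -0.5, 0.3, 0.0],
              [0.0, 0.8, -0.4, 0.6]])
target = np.array([0.4, 0.3])
a = evolve_activations(M, target)
```

Because the map from activations to torque is many-to-one, several distinct vectors `a` can reproduce the same target, which mirrors the paper's observation that the GA finds non-biological activation patterns producing the observed knee torque.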