To localize the neural circuits that generate behavior, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable, open-source atlas containing molecular labels and anatomical region definitions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated extracellular signal-regulated kinase (ERK/MAPK) as a readout of neural activity, we have developed a system to create and contextualize whole-brain maps of stimulus- and behavior-dependent neural activity. This MAP-mapping (mitogen-activated protein kinase mapping) assay is technically simple, fast, and inexpensive, and its data analysis is fully automated. Since MAP-mapping is performed on freely swimming fish, it is applicable to nearly any stimulus or behavior. We demonstrate the utility of this high-throughput approach using hunting/feeding, pharmacological, visual, and noxious stimuli. The resulting maps outline hundreds of areas associated with these stimuli and behaviors.
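The abstract above does not spell out the statistics behind the automated map-making, so the following is only an illustrative sketch of one common approach: a rank-based per-voxel comparison between two groups of activity stacks registered to a common atlas. The function name `activity_map`, the array shapes, and the choice of the Mann-Whitney test are assumptions for illustration, not the authors' exact pipeline.

```python
# Illustrative voxelwise group comparison for registered whole-brain activity
# stacks. NOT the published MAP-mapping pipeline; a minimal sketch only.
import numpy as np
from scipy.stats import mannwhitneyu

def activity_map(group_a, group_b, alpha=0.05):
    """group_a, group_b: arrays of shape (n_fish, z, y, x) holding per-fish
    activity values in a common atlas space. Returns a boolean (z, y, x) map
    of voxels whose activity ranks differ between the two groups."""
    flat_a = group_a.reshape(group_a.shape[0], -1)
    flat_b = group_b.reshape(group_b.shape[0], -1)
    # Mann-Whitney U test applied per voxel (vectorized over the voxel axis).
    _, p = mannwhitneyu(flat_a, flat_b, axis=0)
    return (p < alpha).reshape(group_a.shape[1:])
```

In practice a whole-brain analysis would also correct for multiple comparisons across millions of voxels; that step is omitted here for brevity.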
SUMMARY Escape behaviors deliver organisms away from imminent catastrophe. Here, we characterize behavioral responses of freely swimming larval zebrafish to looming visual stimuli simulating predators. We report that the visual system alone can recruit lateralized, rapid escape motor programs, similar to those elicited by mechanosensory modalities. Two-photon calcium imaging of retino-recipient midbrain regions isolated the optic tectum as an important center processing looming stimuli, with ensemble activity encoding the critical image size determining escape latency. Furthermore, we describe activity in retinal ganglion cell terminals and superficial inhibitory interneurons in the tectum during looming and propose a model for how temporal dynamics in tectal periventricular neurons might arise from computations between these two fundamental constituents. Finally, laser ablations of hindbrain circuitry confirmed that visual and mechanosensory modalities share the same premotor output network. Together, we establish a circuit for the processing of aversive stimuli in the context of an innate visual behavior.
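Looming stimuli of the kind described above are conventionally parameterized by the visual angle subtended by an approaching object, which expands non-linearly as collision nears; the "critical image size" is then a threshold on that angle. A minimal sketch of this standard geometry (variable names `r`, `v` are generic, not taken from the paper):

```python
# Geometry of a looming stimulus: a disc of radius r approaching the eye at
# constant speed v subtends a full visual angle theta(t) = 2*atan(r / (v*t)),
# where t is the remaining time to collision. Illustrative sketch only.
import math

def loom_angle_deg(r, v, t_to_collision):
    """Full angular size (degrees) of the approaching disc at time-to-collision t."""
    return 2 * math.degrees(math.atan(r / (v * t_to_collision)))

def time_at_threshold(r, v, theta_deg):
    """Time-to-collision at which the loom first reaches a critical angle."""
    return r / (v * math.tan(math.radians(theta_deg) / 2))
```

Because the angle depends on `r` and `v` only through the ratio r/v, slower or larger objects reach any given critical angle earlier relative to collision, which is why a size threshold translates into stimulus-dependent escape latencies.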
SUMMARY Detailed descriptions of brain-scale sensorimotor circuits underlying vertebrate behavior remain elusive. Recent advances in zebrafish neuroscience offer new opportunities to dissect such circuits via whole-brain imaging, behavioral analysis, functional perturbations, and network modeling. Here, we harness these tools to generate a brain-scale circuit model of the optomotor response, an orienting behavior evoked by visual motion. We show that such motion is processed by diverse neural response types distributed across multiple brain regions. To transform sensory input into action, these regions sequentially integrate eye- and direction-specific sensory streams, refine representations via interhemispheric inhibition, and demix locomotor instructions to independently drive turning and forward swimming. While experiments revealed many neural response types throughout the brain, modeling identified the dimensions of functional connectivity most critical for the behavior. We thus reveal how distributed neurons collaborate to generate behavior and illustrate a paradigm for distilling functional circuit models from whole-brain data.
In the absence of salient sensory cues to guide behavior, animals must still execute sequences of motor actions in order to forage and explore. How such successive motor actions are coordinated to form global locomotion trajectories is unknown. We mapped the structure of larval zebrafish swim trajectories in homogeneous environments and found that trajectories were characterized by alternating sequences of repeated turns to the left and to the right. Using whole-brain light-sheet imaging, we identified activity relating to the behavior in specific neural populations that we termed the anterior rhombencephalic turning region (ARTR). ARTR perturbations biased swim direction and reduced the dependence of turn direction on turn history, indicating that the ARTR is part of a network generating the temporal correlations in turn direction. We also found suggestive evidence for ARTR mutual inhibition and ARTR projections to premotor neurons. Finally, simulations suggest that the observed turn sequences may underlie efficient exploration of local environments. DOI: http://dx.doi.org/10.7554/eLife.12741.001
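The alternating chains of same-direction turns described above can be caricatured as a two-state process in which each turn repeats the previous direction with some persistence probability. The sketch below is illustrative only; the persistence value `p_stay` is a placeholder, not a parameter fitted to the zebrafish data.

```python
# Minimal simulation of temporally correlated turn sequences: each turn
# repeats the previous direction with probability p_stay, producing runs of
# same-direction turns (p_stay = 0.5 recovers a memoryless coin flip).
import random

def turn_sequence(n_turns, p_stay=0.75, seed=None):
    rng = random.Random(seed)
    turns = [rng.choice([-1, 1])]          # -1 = left turn, +1 = right turn
    for _ in range(n_turns - 1):
        if rng.random() < p_stay:
            turns.append(turns[-1])        # repeat the previous direction
        else:
            turns.append(-turns[-1])       # switch sides
    return turns

def mean_run_length(turns):
    """Average length of consecutive same-direction turn chains."""
    runs, length = [], 1
    for prev, cur in zip(turns, turns[1:]):
        if cur == prev:
            length += 1
        else:
            runs.append(length)
            length = 1
    runs.append(length)
    return sum(runs) / len(runs)
```

In this toy model the expected run length is 1 / (1 - p_stay), so perturbations that reduce the dependence of turn direction on turn history correspond to pushing p_stay toward 0.5.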
Existing techniques for monitoring neural activity in awake, freely behaving vertebrates are invasive and difficult to target to genetically identified neurons. Here we describe the use of bioluminescence to non-invasively monitor the activity of genetically specified neurons in freely behaving zebrafish. Transgenic fish expressing the Ca2+-sensitive photoprotein GFP-apoAequorin (GA) in most neurons generated large, fast bioluminescent signals related to neural activity (neuroluminescence) that could be recorded continuously for many days. To test the limits of this technique, GA was specifically targeted to the hypocretin-positive neurons of the hypothalamus. We found that neuroluminescence generated by this group of ~20 neurons was associated with periods of increased locomotor activity and identified two classes of neural activity corresponding to distinct swim latencies. Thus, our neuroluminescence assay can report, with high temporal resolution and sensitivity, the activity of small subsets of neurons during unrestrained behavior.
The dynamics of living organisms are organized across many spatial scales. However, current cost-effective imaging systems can measure only a subset of these scales at once. We have created a scalable multi-camera array microscope (MCAM) that enables comprehensive high-resolution recording from multiple spatial scales simultaneously, ranging from structures that approach the cellular scale to large-group behavioral dynamics. By collecting data from up to 96 cameras, we computationally generate gigapixel-scale images and movies with a field of view over hundreds of square centimeters at an optical resolution of 18 µm. This allows us to observe the behavior and fine anatomical features of numerous freely moving model organisms on multiple spatial scales, including larval zebrafish, fruit flies, nematodes, carpenter ants, and slime mold. Further, the MCAM architecture allows stereoscopic tracking of the z-position of organisms using the overlapping field of view from adjacent cameras. Overall, by removing the bottlenecks imposed by single-camera image acquisition systems, the MCAM provides a powerful platform for investigating detailed biological features and behavioral processes of small model organisms across a wide range of spatial scales.
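The stereoscopic z-tracking mentioned above, using the overlapping fields of view of adjacent cameras, rests on the standard stereo relation between image disparity and depth. The sketch below shows only that textbook relation; the calibration numbers in the comments are placeholders, not MCAM specifications.

```python
# Standard pinhole stereo relation: two cameras separated by baseline b, with
# focal length f (in pixels), see a point at depth z displaced between their
# images by disparity d = f * b / z. Illustrative sketch, generic parameters.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Depth (mm) of a tracked organism from its inter-camera disparity."""
    return focal_px * baseline_mm / disparity_px
```

The relation is inversely proportional: halving the measured disparity doubles the estimated depth, which is why depth precision degrades for organisms far from the camera plane.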
This paper experimentally examines different configurations of a multi-camera array microscope (MCAM) imaging technology. The MCAM is based upon a densely packed array of “micro-cameras” that jointly image across a large field-of-view (FOV) at high resolution. Each micro-camera within the array images a unique area of the sample of interest, and the data acquired by all 54 micro-cameras are then digitally combined into composite frames whose total pixel counts significantly exceed those of standard microscope systems. We present results from three unique MCAM configurations for different use cases. First, we demonstrate a configuration that simultaneously images and estimates 3D object depth across a 100 × 135 mm² FOV at approximately 20 µm resolution, yielding 0.15 gigapixels (GP) per snapshot. Second, we demonstrate an MCAM configuration that records video across a continuous 83 × 123 mm² FOV at twofold higher resolution (0.48 GP per frame). Finally, we report a third high-resolution configuration (2 µm resolution) that can rapidly produce 9.8 GP composites of large histopathology specimens.
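The reported gigapixel counts can be sanity-checked from the stated FOVs and optical resolutions. The sketch below assumes Nyquist sampling (pixel pitch equal to half the optical resolution); this is an assumption for illustration, since the actual sensor pitch is not given in the abstract.

```python
# Back-of-envelope pixel-count check for an MCAM composite frame, assuming
# Nyquist sampling: two pixels per optically resolved feature.
def gigapixels(fov_w_mm, fov_h_mm, resolution_um):
    pitch_mm = (resolution_um / 2) / 1000.0   # assumed pixel pitch (mm)
    return (fov_w_mm / pitch_mm) * (fov_h_mm / pitch_mm) / 1e9

# 100 x 135 mm at ~20 um -> ~0.14 GP, close to the reported 0.15 GP.
# 83 x 123 mm at ~10 um  -> ~0.41 GP, in rough agreement with 0.48 GP.
```

The residual gap between these estimates and the reported figures plausibly reflects a true sensor pitch slightly finer than the Nyquist assumption, plus overlap between adjacent micro-camera tiles.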