Pushing the frontier of fluorescence microscopy requires the design of enhanced fluorophores with finely tuned properties. We recently discovered that incorporation of four-membered azetidine rings into classic fluorophore structures elicits substantial increases in brightness and photostability, resulting in the ‘Janelia Fluor’ (JF) series of dyes. Here, we refine and extend this strategy, showing that incorporation of 3-substituted azetidine groups allows rational tuning of the spectral and chemical properties with unprecedented precision. This strategy yields a palette of new fluorescent and fluorogenic labels with excitation ranging from blue to the far-red with utility in cells, tissue, and animals.
The comprehensive reconstruction of cell lineages in complex multicellular organisms is a central goal of developmental biology. We present an open-source computational framework for the segmentation and tracking of cell nuclei with high accuracy and speed. We demonstrate its (i) generality by reconstructing cell lineages in four-dimensional, terabyte-sized image data sets of fruit fly, zebrafish and mouse embryos acquired with three types of fluorescence microscopes, (ii) scalability by analyzing advanced stages of development with up to 20,000 cells per time point at 26,000 cells min⁻¹ on a single computer workstation and (iii) ease of use by adjusting only two parameters across all data sets and providing visualization and editing tools for efficient data curation. Our approach achieves on average 97.0% linkage accuracy across all species and imaging modalities. Using our system, we performed the first cell lineage reconstruction of early Drosophila melanogaster nervous system development, revealing neuroblast dynamics throughout an entire embryo.
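The reported linkage accuracy can be read as the fraction of ground-truth parent-to-child nucleus links that the tracker recovers across consecutive time points. A minimal sketch of that metric (a toy illustration under this reading, not the framework's actual evaluation code):

```python
def linkage_accuracy(predicted_links, ground_truth_links):
    """Fraction of ground-truth parent->child nucleus links recovered.

    Links are (parent_id, child_id) pairs across consecutive time points.
    Toy sketch of the metric, not the paper's evaluation code.
    """
    predicted = set(predicted_links)
    correct = sum(1 for link in ground_truth_links if link in predicted)
    return correct / len(ground_truth_links)

# Toy example: 3 of 4 true links recovered.
truth = [(1, 10), (1, 11), (2, 12), (3, 13)]
pred = [(1, 10), (1, 11), (2, 12), (3, 14)]
print(linkage_accuracy(pred, truth))  # 0.75
```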
A subset of Drosophila neurons that expresses crustacean cardioactive peptide (CCAP) has been shown previously to make the hormone bursicon, which is required for cuticle tanning and wing expansion after eclosion. Here we present evidence that CCAP-expressing neurons (N_CCAP) consist of two functionally distinct groups, one of which releases bursicon into the hemolymph and the other of which regulates its release. The first group, which we call N_CCAP-c929, includes 14 bursicon-expressing neurons of the abdominal ganglion that lie within the expression pattern of the enhancer-trap line c929-Gal4. We show that suppression of activity within this group blocks bursicon release into the hemolymph together with tanning and wing expansion. The second group, which we call N_CCAP-R, consists of N_CCAP neurons outside the c929-Gal4 pattern. Because suppression of synaptic transmission and protein kinase A (PKA) activity throughout N_CCAP, but not in N_CCAP-c929, also blocks tanning and wing expansion, we conclude that neurotransmission and PKA are required in N_CCAP-R to regulate bursicon secretion from N_CCAP-c929. Enhancement of electrical activity in N_CCAP-R by expression of the bacterial sodium channel NaChBac also blocks tanning and wing expansion and leads to depletion of bursicon from central processes. NaChBac expression in N_CCAP-c929 is without effect, suggesting that the abdominal bursicon-secreting neurons are likely to be silent until stimulated to release the hormone. Our results indicate that N_CCAP form an interacting neuronal network responsible for the regulation and release of bursicon and support a model in which PKA-mediated stimulation of inputs to normally quiescent bursicon-expressing neurons activates release of the hormone.
Optimal image quality in light-sheet microscopy requires a perfect overlap between the illuminating light sheet and the focal plane of the detection objective. However, mismatches between the light-sheet and detection planes are common owing to the spatiotemporally varying optical properties of living specimens. Here we present the AutoPilot framework, an automated method for spatiotemporally adaptive imaging that integrates (i) a multi-view light-sheet microscope capable of digitally translating and rotating light-sheet and detection planes in three dimensions and (ii) a computational method that continuously optimizes spatial resolution across the specimen volume in real time. We demonstrate long-term adaptive imaging of entire developing zebrafish (Danio rerio) and Drosophila melanogaster embryos and perform adaptive whole-brain functional imaging in larval zebrafish. Our method improves spatial resolution and signal strength two- to fivefold, recovers cellular and sub-cellular structures in many regions that are not resolved by non-adaptive imaging, adapts to spatiotemporal dynamics of genetically encoded fluorescent markers and robustly optimizes imaging performance during large-scale morphogenetic changes in living organisms.
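The core idea of adaptive plane alignment can be sketched as a search over candidate light-sheet offsets for the one that maximizes an image-quality score. The gradient-based sharpness score and the function names below are illustrative assumptions; AutoPilot uses its own quality metric and optimization scheme:

```python
import numpy as np

def focus_metric(img):
    """Sharpness proxy: mean squared finite-difference gradient.

    A hypothetical stand-in for the image-quality measure the
    adaptive framework optimizes.
    """
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(gx**2 + gy**2))

def best_offset(images_by_offset):
    """Return the light-sheet offset whose image scores highest."""
    return max(images_by_offset, key=lambda off: focus_metric(images_by_offset[off]))

# Toy demonstration: a sharp edge image vs. blurred versions of it.
sharp = np.zeros((32, 32))
sharp[:, 16:] = 1.0

def blur(img, n):
    out = img.copy()
    for _ in range(n):  # repeated box blur along x
        out = (np.roll(out, 1, 1) + out + np.roll(out, -1, 1)) / 3.0
    return out

stack = {-1: blur(sharp, 4), 0: sharp, 1: blur(sharp, 2)}
print(best_offset(stack))  # 0 (the in-focus plane)
```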
We present the Real-time Accurate Cell-shape Extractor (RACE), a high-throughput image analysis framework for automated three-dimensional cell segmentation in large-scale images. RACE is 55-330 times faster and 2-5 times more accurate than state-of-the-art methods. We demonstrate the generality of RACE by extracting cell-shape information from entire Drosophila, zebrafish, and mouse embryos imaged with confocal and light-sheet microscopes. Using RACE, we automatically reconstructed cellular-resolution tissue anisotropy maps across developing Drosophila embryos and quantified differences in cell-shape dynamics in wild-type and mutant embryos. We furthermore integrated RACE with our framework for automated cell lineaging and performed joint segmentation and cell tracking in entire Drosophila embryos. RACE processed these terabyte-sized datasets on a single computer within 1.4 days. RACE is easy to use, as it requires adjustment of only three parameters, takes full advantage of state-of-the-art multi-core processors and graphics cards, and is available as open-source software for Windows, Linux, and Mac OS.
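One common way to quantify cell-shape anisotropy from a segmentation is the ratio of principal axis lengths derived from the covariance of a cell's voxel coordinates. The sketch below illustrates that generic shape descriptor; it is an assumption for illustration, not RACE's specific anisotropy measure:

```python
import numpy as np

def cell_anisotropy(voxel_coords):
    """Shape anisotropy of one segmented cell from its voxel coordinates.

    Ratio of longest to shortest principal axis length (square roots of
    the coordinate-covariance eigenvalues); 1.0 means isotropic. A generic
    descriptor, hypothetical relative to RACE's exact measures.
    """
    cov = np.cov(np.asarray(voxel_coords, dtype=float).T)
    lengths = np.sqrt(np.linalg.eigvalsh(cov))  # eigenvalues sorted ascending
    return float(lengths[-1] / lengths[0])

# Toy: an elongated cell as a 3 x 3 x 9 block of voxels.
zs, ys, xs = np.meshgrid(range(3), range(3), range(9), indexing="ij")
elongated = np.column_stack([zs.ravel(), ys.ravel(), xs.ravel()])
print(round(cell_anisotropy(elongated), 2))  # 3.16 (~sqrt(10))
```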
Understanding how the brain works in tight concert with the rest of the central nervous system (CNS) hinges upon knowledge of coordinated activity patterns across the whole CNS. We present a method for measuring activity in an entire, non-transparent CNS with high spatiotemporal resolution. We combine a light-sheet microscope capable of simultaneous multi-view imaging at volumetric speeds 25-fold faster than the state-of-the-art, a whole-CNS imaging assay for the isolated Drosophila larval CNS and a computational framework for analysing multi-view, whole-CNS calcium imaging data. We image both brain and ventral nerve cord, covering the entire CNS at 2 or 5 Hz with two- or one-photon excitation, respectively. By mapping network activity during fictive behaviours and quantitatively comparing high-resolution whole-CNS activity maps across individuals, we predict functional connections between CNS regions and reveal neurons in the brain that identify type and temporal state of motor programs executed in the ventral nerve cord.
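Whole-CNS activity maps of the kind described above are typically built from normalized calcium traces. A minimal ΔF/F sketch, using a percentile baseline (a standard normalization in the field, not the paper's specific analysis code):

```python
import numpy as np

def delta_f_over_f(trace, baseline_frac=0.2):
    """ΔF/F calcium trace: (F - F0) / F0, with F0 a low-percentile baseline.

    Generic calcium-imaging normalization; the baseline fraction is an
    illustrative choice, not a value from the paper.
    """
    trace = np.asarray(trace, dtype=float)
    f0 = np.percentile(trace, baseline_frac * 100)
    return (trace - f0) / f0

trace = [10, 10, 10, 30, 10]
print(delta_f_over_f(trace).max())  # 2.0
```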
Imaging fast cellular dynamics across large specimens requires high resolution in all dimensions, high imaging speeds, good physical coverage and low photo-damage. To meet these requirements, we developed isotropic multiview (IsoView) light-sheet microscopy, which rapidly images large specimens via simultaneous light-sheet illumination and fluorescence detection along four orthogonal directions. Combining these four views by means of high-throughput multiview deconvolution yields images with high resolution in all three dimensions. We demonstrate whole-animal functional imaging of Drosophila larvae at a spatial resolution of 1.1-2.5 μm and temporal resolution of 2 Hz for several hours. We also present spatially isotropic whole-brain functional imaging in Danio rerio larvae and spatially isotropic multicolor imaging of fast cellular dynamics across gastrulating Drosophila embryos. Compared with conventional light-sheet microscopy, IsoView microscopy improves spatial resolution at least sevenfold and decreases resolution anisotropy at least threefold. Compared with existing high-resolution light-sheet techniques, IsoView microscopy effectively doubles the penetration depth and provides subsecond temporal resolution for specimens 400-fold larger than could previously be imaged.
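The multiview deconvolution step that fuses the four orthogonal views can be illustrated with a 1-D multiview Richardson-Lucy toy. This is a sketch of the general technique only; IsoView's actual implementation is a high-throughput 3-D pipeline:

```python
import numpy as np

def multiview_richardson_lucy(views, psfs, n_iter=50):
    """Joint Richardson-Lucy deconvolution of several views of one signal.

    Sequentially applies the multiplicative RL update for each view.
    1-D toy sketch of multiview deconvolution, not the paper's code.
    """
    estimate = np.full_like(views[0], views[0].mean())
    for _ in range(n_iter):
        for view, psf in zip(views, psfs):
            blurred = np.convolve(estimate, psf, mode="same")
            ratio = view / np.maximum(blurred, 1e-12)
            # correlate with the flipped PSF, then update multiplicatively
            estimate = estimate * np.convolve(ratio, psf[::-1], mode="same")
    return estimate

# Toy: two views of a two-spike signal, blurred by different PSFs.
truth = np.zeros(64)
truth[20] = 1.0
truth[40] = 0.5
psf_a = np.array([0.25, 0.5, 0.25])
psf_b = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
views = [np.convolve(truth, p, mode="same") for p in (psf_a, psf_b)]
restored = multiview_richardson_lucy(views, [psf_a, psf_b])
print(int(np.argmax(restored)))  # 20: the main spike is recovered
```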