The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

We have conducted a large-scale comparative study of software packages developed in the context of SMLM, including recently developed algorithms. We designed realistic data that are generic and cover a broad range of experimental conditions, and compared the software packages using a multiple-criterion quantitative assessment based on a known ground truth.

Our study is based on the active participation of developers of SMLM software. More than 30 groups have participated so far, and the study is still under way. We provide participants access to our benchmark data as an ongoing public challenge. Participants run their own software on our data and report their list of localized particles for evaluation.
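The evaluation criteria above (detection rate and localization accuracy against a known ground truth) can be made concrete with a small sketch. This is a hypothetical illustration, not the challenge's official scoring code; the greedy nearest-neighbor matching, the Jaccard-style detection rate and the `tol` radius are assumptions:

```python
import numpy as np

def match_and_score(gt, est, tol=50.0):
    """Greedily match estimated positions to ground-truth positions
    within a tolerance radius (e.g. nm), then report a Jaccard-style
    detection rate and the RMSE over matched pairs."""
    gt = np.asarray(gt, float)
    est = np.asarray(est, float)
    unmatched = list(range(len(est)))
    pairs = []
    for i, p in enumerate(gt):
        if not unmatched:
            break
        d = np.linalg.norm(est[unmatched] - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= tol:
            pairs.append((i, unmatched.pop(j)))
    tp = len(pairs)                 # true positives: matched localizations
    fn = len(gt) - tp               # missed ground-truth molecules
    fp = len(est) - tp              # spurious localizations
    jaccard = tp / (tp + fn + fp) if (tp + fn + fp) else 0.0
    rmse = (np.sqrt(np.mean([np.sum((gt[a] - est[b]) ** 2) for a, b in pairs]))
            if pairs else float("nan"))
    return jaccard, rmse
```

The detection rate penalizes both missed molecules and false localizations, while the RMSE quantifies accuracy over the detected subset only; this separation is what allows the tradeoffs mentioned above to be compared.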
The results of the challenge are accessible online and updated regularly.

SMLM was demonstrated in 2006, independently by three research groups 1-3, and has enabled subsequent breakthroughs in diverse fields 4,5. SMLM can resolve biological structures at the nanometer scale (typically 20 nm lateral resolution), circumventing Abbe's diffraction limit. At the cost of a relatively simple setup 6,7, it has opened exciting new opportunities in life science research 8,9.

The underlying principle of SMLM is the sequential imaging of sparse subsets of fluorophores distributed over thousands of frames, to populate a high-density map of fluorophore positions. Such large data sets require automated image-analysis algorithms to detect and precisely infer the positions of individual fluorophores, taking advantage of their separation in space and time.

The acquired data cannot be visualized directly; further computerized image-reconstruction methods are required. These typically comprise four steps: preprocessing, detection, localization and rendering. Preprocessing reduces the effects of the background and noise; detection identifies potential molecule candidates in each frame; localization performs a subpixel refinement...
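The preprocessing, detection and localization steps described above can be sketched in a minimal toy form. This is not any participant's software; the median background subtraction, the noise-scaled threshold, the window size and the centroid-based subpixel refinement are all illustrative choices (rendering, the fourth step, would simply histogram the returned positions):

```python
import numpy as np

def localize_frame(frame, thresh=5.0, win=3):
    """Toy single-frame SMLM pipeline:
    1) preprocess: subtract the median background
    2) detect: local maxima above a noise-scaled threshold
    3) localize: intensity-weighted centroid in a small window."""
    img = frame.astype(float) - np.median(frame)   # crude background removal
    noise = np.std(img)
    peaks = []
    h, w = img.shape
    for y in range(win, h - win):
        for x in range(win, w - win):
            patch = img[y - win:y + win + 1, x - win:x + win + 1]
            if img[y, x] == patch.max() and img[y, x] > thresh * noise:
                # subpixel refinement: centroid of the clipped patch
                p = np.clip(patch, 0, None)
                ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
                peaks.append((y + (ys * p).sum() / p.sum(),
                              x + (xs * p).sum() / p.sum()))
    return peaks
```

Real packages replace the centroid step with model-based fitting (e.g. Gaussian or realistic PSF fits), which is precisely where the packages evaluated in the study differ most.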
Summary

Localization microscopy relies on computationally efficient Gaussian approximations of the point spread function for the calculation of fluorophore positions. Theoretical predictions show that, under specific experimental conditions, localization accuracy is significantly improved when the localization is performed using a more realistic model. Here, we show how this can be achieved by considering three-dimensional (3-D) point spread function models for the wide-field microscope. We introduce a least-squares point spread function fitting framework that utilizes the Gibson and Lanni model and propose a computationally efficient way of evaluating its derivative functions. We demonstrate the usefulness of the proposed approach with algorithms for particle localization and defocus estimation, both implemented as plugins for ImageJ.
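The least-squares fitting framework can be illustrated with the simpler Gaussian PSF approximation mentioned above (not the Gibson and Lanni model itself). The parameterization and the use of `scipy.optimize.least_squares` are illustrative choices, not the paper's implementation:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_gaussian_psf(img, x0):
    """Least-squares fit of a 2-D Gaussian PSF model
        I(x, y) = A * exp(-((x - xc)^2 + (y - yc)^2) / (2 s^2)) + b
    to an image patch. x0 is the initial guess [A, xc, yc, s, b]."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]

    def residuals(p):
        A, xc, yc, s, b = p
        model = A * np.exp(-((xs - xc) ** 2 + (ys - yc) ** 2) / (2 * s ** 2)) + b
        return (model - img).ravel()

    return least_squares(residuals, x0).x
```

Swapping the Gaussian model for a realistic 3-D PSF keeps the same least-squares structure; the computational cost then hinges on evaluating the model and its derivatives efficiently, which is the contribution described above.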
Super-resolution microscopy such as STORM and (F)PALM is now a well-known method for biological studies at the nanometer scale. However, conventional imaging schemes based on the sparse activation of photo-switchable fluorescent probes have inherently slow temporal resolution, which is a serious limitation when investigating live-cell dynamics. Here, we present an algorithm for high-density super-resolution microscopy that combines a sparsity-promoting formulation with a Taylor series approximation of the PSF. Our algorithm is designed to provide unbiased localization on continuous space and high recall rates for high-density imaging, and to have orders-of-magnitude shorter run times than previous high-density algorithms. We validated our algorithm on both simulated and experimental data, and demonstrated live-cell imaging with a temporal resolution of 2.5 seconds by recovering fast ER dynamics.
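The sparsity-promoting formulation can be sketched in its most basic grid-based form: recover a sparse, nonnegative vector of molecule intensities x from an image y = Ax, where A maps grid positions through the PSF. The iterative soft-thresholding (ISTA) sketch below is a generic illustration, not the paper's algorithm, which additionally achieves off-grid (continuous-space) localization via the Taylor series approximation of the PSF:

```python
import numpy as np

def ista(A, y, lam, n_iter=200):
    """Iterative soft-thresholding (ISTA) for the sparsity-promoting problem
        min_x  0.5 * ||A x - y||^2 + lam * ||x||_1,   subject to x >= 0."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the data-fidelity term
        x = np.maximum(x - grad / L - lam / L, 0.0)   # nonnegative soft-threshold
    return x
```

The l1 penalty keeps only a few active grid positions even when emitters overlap, which is what makes high-density (and hence faster) imaging tractable.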
Abstract: This paper is devoted to the characterization of an extended family of continuous-time autoregressive moving average (CARMA) processes that are solutions of stochastic differential equations driven by white Lévy innovations. These are completely specified by: 1) a set of poles and zeros that fixes their correlation structure, and 2) a canonical infinitely divisible probability distribution that controls their degree of sparsity (with the Gaussian model corresponding to the least sparse scenario). The generalized CARMA processes are either stationary or nonstationary, depending on the location of the poles in the complex plane. The most basic nonstationary representatives (with a single pole at the origin) are the Lévy processes, which are the non-Gaussian counterparts of Brownian motion. We focus on the general analog-to-discrete conversion problem and introduce a novel spline-based formalism that greatly simplifies the derivation of the correlation properties and joint probability distributions of the discrete versions of these processes. We also rely on the concept of a generalized increment process, which suppresses all long-range dependencies, to specify an equivalent discrete-domain innovation model. A crucial ingredient is the existence of a minimally supported function associated with the whitening operator L; this B-spline, which is fundamental to our formulation, appears in most of our formulas, both at the level of the correlation and the characteristic function. We make use of these discrete-domain results to numerically generate illustrative examples of sparse signals that are consistent with the continuous-domain model.
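The analog-to-discrete conversion described above can be illustrated with the simplest stationary member of the family: the Gaussian CARMA(1,0) process (one pole, no zeros), i.e. the Ornstein-Uhlenbeck process, which corresponds to the least sparse (Gaussian) scenario. Its samples form an exact AR(1) recursion, a standard fact used here for illustration; the discretization of the general Lévy-driven case is what requires the B-spline machinery of the paper:

```python
import numpy as np

def simulate_ou(a=1.0, sigma=1.0, h=0.1, n=200_000, seed=0):
    """Exact discretization of the Gaussian Ornstein-Uhlenbeck process
        dX = -a X dt + sigma dW.
    Sampling with step h gives the AR(1) recursion
        X[k+1] = r X[k] + e[k],
    with r = exp(-a h) and Var(e) = sigma^2 (1 - r^2) / (2 a)."""
    rng = np.random.default_rng(seed)
    r = np.exp(-a * h)
    s = sigma * np.sqrt((1 - r ** 2) / (2 * a))
    x = np.empty(n)
    x[0] = rng.normal(0, sigma / np.sqrt(2 * a))   # draw from stationary law
    for k in range(n - 1):
        x[k + 1] = r * x[k] + s * rng.normal()
    return x
```

The lag-1 correlation of the samples equals exp(-a h), and the stationary variance equals sigma^2 / (2 a), matching the continuous-domain model regardless of the sampling step.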
Abstract: The problem of estimating continuous-domain autoregressive moving-average processes from sampled data is considered. The proposed approach incorporates the sampling process into the problem formulation while introducing exponential models for both the continuous and the sampled processes. We derive an exact evaluation of the discrete-domain power spectrum using exponential B-splines and further suggest an estimation approach that is based on digitally filtering the available data. The proposed functional, which is related to Whittle's likelihood function, exhibits several local minima that originate from aliasing. The global minimum, however, corresponds to a maximum-likelihood estimator, regardless of the sampling step. Experimental results indicate that the proposed approach closely follows the Cramér-Rao bound for various aliasing configurations.
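A Whittle-type functional of the kind mentioned above can be sketched for the simplest discrete case, a sampled AR(1) spectrum. This is a generic textbook illustration, not the paper's exponential-B-spline estimator; the profiling of the innovation variance and the use of `scipy.optimize.minimize_scalar` are implementation choices made here:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_ar1(x):
    """Whittle-likelihood estimate of the AR(1) coefficient r from data x:
    minimize sum_k [ log S(w_k) + I(w_k) / S(w_k) ] over Fourier frequencies,
    with spectrum S(w) = s2 / |1 - r e^{-jw}|^2 and periodogram I.
    The innovation variance s2 is profiled out in closed form."""
    n = len(x)
    I = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n   # periodogram
    w = 2 * np.pi * np.arange(len(I)) / n

    def nll(r):
        g = np.abs(1 - r * np.exp(-1j * w)) ** 2     # 1/S up to the factor s2
        s2 = np.mean(I * g)                          # profiled innovation variance
        return np.sum(np.log(s2 / g) + I * g / s2)

    return minimize_scalar(nll, bounds=(-0.99, 0.99), method="bounded").x
```

In the continuous-domain setting of the paper, aliasing folds the spectrum and creates the spurious local minima mentioned above, which is why the optimization landscape has to be handled with care even though the global minimum remains the maximum-likelihood estimate.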