Single-molecule localization microscopy (SMLM) enables high-resolution imaging by localizing spatially isolated fluorescent particles. Although challenging, SMLM analysis yields a list of the positions of individual molecules, allowing valuable quantification of the stoichiometry and spatial organization of molecular actors. Both the signal-to-noise ratio and the density (D_frame), i.e., the number of fluorescent particles per µm² per frame, have previously been identified as determining factors for reaching a given SMLM precision. A comprehensive theoretical study of these two parameters is therefore of central interest to delineate the limits achievable for accurate SMLM observations. Our study reports that, in the absence of prior knowledge of the signal intensity a, the effect of density on particle localization is more prominent than anticipated from theoretical studies performed at known a. A first limit appears when, under a low-density hypothesis (i.e., a one-Gaussian fitting hypothesis), any fluorescent particle closer than 600 nm to the particle of interest biases its localization. In fact, all particles, even dimly fluorescent ones, must be accounted for to ensure unbiased localization of the surrounding particles. Moreover, even under a high-density hypothesis (i.e., a multi-Gaussian fitting hypothesis), a second limit arises because particles located too close together cannot be distinguished. An increase in D_frame is thus likely to deteriorate the localization precision, the image reconstruction, and, more generally, the quantification accuracy. Our study first provides a density versus signal-to-noise ratio diagram for use as a guide when recording data toward an achievable SMLM resolution.
Additionally, it identifies the essential requirements for implementing UNLOC, a parameter-free, computationally fast algorithm that approaches the Cramér-Rao bound for particles at high density per frame, without any prior knowledge of their intensity. UNLOC is available as an ImageJ plugin.
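The first limit described above, that a neighbour closer than about 600 nm biases a single-emitter localization, can be illustrated with a minimal numpy sketch. This is not the UNLOC algorithm: centre-of-mass localization stands in for one-Gaussian fitting, and the 100 nm-per-pixel sampling and particle parameters are assumptions chosen purely for illustration.

```python
import numpy as np

def psf(shape, x0, y0, sigma, amplitude):
    """Render an ideal, noise-free 2D Gaussian PSF on a pixel grid."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return amplitude * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))

def centroid(img):
    """Single-emitter position estimate under the one-emitter hypothesis
    (centre of mass, a crude stand-in for one-Gaussian fitting)."""
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = img.sum()
    return (img * xx).sum() / total, (img * yy).sum() / total

# Isolated emitter at x = 20 px: the estimate is unbiased.
alone = psf((41, 41), 20.0, 20.0, sigma=1.3, amplitude=1000.0)
x_alone, _ = centroid(alone)

# Add a 5x dimmer neighbour 6 px away (~600 nm at 100 nm/px):
# the single-emitter estimate is pulled toward it.
crowded = alone + psf((41, 41), 26.0, 20.0, sigma=1.3, amplitude=200.0)
x_crowd, _ = centroid(crowded)

print(round(x_alone, 2), round(x_crowd, 2))  # prints 20.0 21.0
```

With the neighbour present, the estimated position shifts by a full pixel (100 nm under the assumed sampling), even though the neighbour is five times dimmer, which is why dim surrounding particles must also be accounted for.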
This paper presents a contextual algorithm for computing Baum's forward and backward probabilities, which are used intensively in the framework of Hidden Markov Chain (HMC) models. The method differs from the original algorithm in that it takes into account only a neighborhood of limited length, rather than the entire chain, for its computations. Comparative experiments with respect to the neighborhood size have been conducted on both Markovian (simulated) and non-Markovian (image) data, by means of supervised and unsupervised classifications.
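The quantities involved can be sketched with a generic scaled forward-backward recursion in Python. This is the classical whole-chain version, not the paper's contextual variant, which would restrict the same two recursions to a neighborhood of limited length around each site; the toy chain parameters are illustrative assumptions.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Baum's scaled forward (alpha) and backward (beta) recursions for a
    hidden Markov chain; returns the marginal posteriors gamma(t, state)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    c = np.zeros(T)  # per-step scaling factors to avoid underflow
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy chain: two hidden classes, two observation symbols.
pi = np.array([0.5, 0.5])                # initial distribution
A = np.array([[0.9, 0.1], [0.1, 0.9]])   # sticky transition matrix
B = np.array([[0.8, 0.2], [0.2, 0.8]])   # emission probabilities
obs = [0, 0, 0, 1, 1, 1]
gamma = forward_backward(pi, A, B, obs)
mpm = gamma.argmax(axis=1)               # marginal posterior mode per site
print(mpm)  # prints [0 0 0 1 1 1]
```

The marginal posterior mode classification recovers the two regimes of the observation sequence; a contextual variant trades some of this information for a cost that no longer grows with the chain length.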
This work deals with unsupervised change detection in bi-date Synthetic Aperture Radar (SAR) images. Whatever the indicator of change used to compute the criterion image, e.g., the log-ratio or the Kullback-Leibler divergence between images, we have observed poor-quality change maps for some events when using the Hidden Markov Chain (HMC) model on which this work focuses. The main reason comes from the stationarity assumption involved in this model (and in most Markovian models, such as Hidden Markov Random Fields), which cannot be justified in most observed scenes: changed areas are not necessarily stationary in the image. Apart from the non-stationary Markov models proposed in the literature, the aim of this paper is to describe a pragmatic solution to this stationarity issue by evaluating and comparing 1D and 2D window approaches. By moving the window through the criterion image, the process is able to produce a change map that better exhibits non-stationary changes than the classical HMC applied directly to the whole criterion image. Special care is devoted to the estimation of the number of classes in each window, which can vary from one (no change) to three (positive change, negative change, and no change), using the corrected Akaike Information Criterion suited to small samples. The quality of the proposed approaches is assessed with a pair of RADARSAT images bracketing the Mount Nyiragongo volcano eruption of January 2002. The available ground truth confirms the effectiveness of the proposed approach compared to a classical HMC-based strategy.
This work deals with unsupervised change detection in bi-date Synthetic Aperture Radar (SAR) images. Whatever the indicator of change used, e.g., the log-ratio or the Kullback-Leibler divergence, we have observed poor-quality change maps for some events when using the Hidden Markov Chain (HMC) model on which this work focuses. The main reason comes from the stationarity assumption involved in this model (and in most Markovian models, such as Hidden Markov Random Fields), which cannot be justified in most observed scenes: changed areas are not necessarily stationary in the image. Apart from the few non-stationary Markov models proposed in the literature, the aim of this paper is to describe a pragmatic solution to this stationarity issue by using a sliding-window strategy. In this algorithm, the criterion image is scanned pixel by pixel, and a classical HMC model is applied only to the neighboring pixels. By moving the window through the image, the process is able to produce a change map that better exhibits non-stationary changes than the classical HMC applied directly to the whole criterion image. Special care is devoted to the estimation of the number of classes in each window, which can vary from one (no change) to three (positive change, negative change, and no change), using the corrected Akaike Information Criterion (AICc) suited to small samples. The quality of the proposed approach is assessed with speckle-simulated images into which simulated changes are introduced. The windowed strategy is also evaluated with a pair of RADARSAT images bracketing the Nyiragongo volcano eruption of January 2002. The available ground truth confirms the effectiveness of the proposed approach compared to a classical HMC-based strategy.
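The per-window selection of the number of classes can be sketched as follows. This is an illustrative reconstruction rather than the authors' implementation: a small EM fit of a 1D Gaussian mixture to the log-ratio values of one window, with the number of components k in {1, 2, 3} selected by AICc; the synthetic data and the EM initialization are assumptions made for the example.

```python
import numpy as np

def gmm_loglik(x, k, iters=200):
    """Fit a k-component 1D Gaussian mixture by EM (quantile init);
    return the maximized log-likelihood."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var() + 1e-6)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        d = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = w * d
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    d = np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return np.log((w * d).sum(axis=1)).sum()

def aicc(loglik, p, n):
    """Corrected AIC: the usual AIC plus a small-sample penalty term."""
    return 2 * p - 2 * loglik + 2 * p * (p + 1) / (n - p - 1)

# One window of a log-ratio criterion image: a "no change" mode near 0
# plus a "positive change" mode (synthetic values for illustration).
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 0.3, 300), rng.normal(2.5, 0.3, 100)])

# A k-component mixture has 3k - 1 free parameters (means, variances, weights).
scores = {k: aicc(gmm_loglik(x, k), p=3 * k - 1, n=len(x)) for k in (1, 2, 3)}
best_k = min(scores, key=scores.get)
print(best_k, "classes selected")
```

AICc decisively rejects the single-class model on this window, matching the point of the abstract that the number of classes has to be estimated per window rather than fixed in advance.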
This article examines the plight of Tunisian 'Arabs' and 'Jews' during the critical period between 8 November 1942 and 20 May 1943 through the lens of a local subaltern narrative, known in women's oral history as the Year of the Typhus. Relying on interviews conducted with women survivors of World War II, family archives, war memoirs, and Allied military correspondence, this article presents four main arguments. First, both the Jew and the Arab appear as a dehumanized Oriental Otherness in the war diaries of the Anglo-American soldiers and war correspondents stationed in North Africa during Operation Torch. Second, both the Allies and the Axis Powers were anti-Semitic, not just the Nazis. Third, the chapter of the Holocaust in North Africa is not separate from the European history of colonialism in Africa and the Middle East. Although today the word anti-Semitism includes only the 'Jew', World War II archives show that it included various ethnicities, namely the Jew, the Arab, the Negro/Black, and the Italian. Finally, the relations between 'Jews' and 'Arabs' during the war were governed by the racial and social hierarchy characterizing all colonized societies, as well as by the French colonial politics of nativization and denativization. This hierarchical system shaped not only how these two groups viewed each other, but also how they viewed themselves.