Image deblurring is an important topic in imaging science. In this review we consider fluorescence microscopy and optical/infrared astronomy together because of two common features: in both cases the imaging system can be described, to a sufficiently good approximation, by a convolution operator whose kernel is the so-called point-spread function (PSF); moreover, the data are affected by photon noise, described by a Poisson process. This statistical property of the noise, which is also common to emission tomography, is the basis of the maximum likelihood and Bayesian approaches introduced in the mid-1980s. Since then, a huge amount of literature has been produced on these topics. This paper is a tutorial and a review of a relevant part of that literature, including some of our previous contributions. We discuss the mathematical modeling of the process of image formation and detection, and we introduce the so-called Bayesian paradigm that provides the basis for the statistical treatment of the problem. Next, we describe and discuss the most frequently used algorithms, as well as other approaches based on a different description of the Poisson noise. We conclude with a review of other topics related to image deblurring, such as boundary effect correction, space-variant PSFs, super-resolution, blind deconvolution and multiple-image deconvolution.
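The maximum-likelihood approach for Poisson data mentioned above leads, via expectation maximization, to the well-known Richardson-Lucy iteration. A minimal NumPy sketch, assuming a spatially invariant, unit-sum PSF applied by FFT-based circular convolution (the function name and the flat initialization are illustrative choices, not the authors' implementation):

```python
import numpy as np

def richardson_lucy(y, psf, n_iter=50, eps=1e-12):
    """Richardson-Lucy deconvolution for Poisson data.

    y   : observed (blurred, noisy) nonnegative image
    psf : point-spread function, centered in its array
    """
    psf = psf / psf.sum()                          # unit-sum normalization
    otf = np.fft.rfft2(np.fft.ifftshift(psf), s=y.shape)
    conv = lambda f, H: np.fft.irfft2(np.fft.rfft2(f) * H, s=y.shape)
    x = np.full_like(y, y.mean(), dtype=float)     # flat initial guess
    for _ in range(n_iter):
        blurred = conv(x, otf)                     # A x
        ratio = y / np.maximum(blurred, eps)       # data / model
        x *= conv(ratio, np.conj(otf))             # multiplicative update x * A^T(ratio)
    return x
```

Two properties of the iteration, visible in the update rule, are that it preserves nonnegativity and (for a unit-sum PSF) the total flux of the data.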
Several methods based on different image models have been proposed and developed for image denoising. Some of them, such as total variation (TV) and wavelet thresholding, are based on the assumption of additive Gaussian noise. Recently the TV approach has been extended to the case of Poisson noise, a model describing the effect of photon counting in applications such as emission tomography, microscopy and astronomy. For the removal of this kind of noise we consider an approach based on a constrained optimization problem, with an objective function combining the Kullback-Leibler divergence, as data-fidelity term, with TV or other edge-preserving regularizations. We introduce a new discrepancy principle for the choice of the regularization parameter, which is justified by the statistical properties of the Poisson noise. For solving the optimization problem we propose a particular form of a general scaled gradient projection (SGP) method, recently introduced for image deblurring. We derive the form of the scaling from a decomposition of the gradient of the regularization functional into a positive and a negative part. The beneficial effect of the scaling is demonstrated by means of numerical simulations, showing that the performance of the proposed form of SGP is superior to that of the most efficient gradient projection methods. An extended numerical analysis of the dependence of the solution on the regularization parameter is also performed to test the effectiveness of the proposed discrepancy principle.
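The two ingredients of the objective function, the generalized Kullback-Leibler divergence as data-fidelity term and a smoothed total variation as regularizer, can be written down in a few lines. A minimal NumPy sketch (the smoothing parameter and the forward-difference discretization are illustrative assumptions, not the specific choices of the paper):

```python
import numpy as np

def gen_kl(y, x, eps=1e-12):
    """Generalized Kullback-Leibler divergence KL(y || x) for Poisson data.

    Nonnegative, and zero if and only if x = y (up to the eps safeguard).
    """
    x = np.maximum(x, eps)
    return np.sum(y * np.log(np.maximum(y, eps) / x) + x - y)

def tv(x, eps=1e-12):
    """Smoothed isotropic total variation of a 2-D image.

    Forward differences with replicated last row/column; eps makes the
    functional differentiable at flat regions.
    """
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    return np.sum(np.sqrt(dx**2 + dy**2 + eps))

def objective(y, x, beta):
    """Penalized objective J(x) = KL(y||x) + beta * TV(x), minimized over x >= 0."""
    return gen_kl(y, x) + beta * tv(x)
```

The constrained problem is then the minimization of `objective` over the nonnegative orthant, which is what the scaled gradient projection method of the abstract addresses.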
In applications of imaging science, such as emission tomography, fluorescence microscopy and optical/infrared astronomy, image intensity is measured via the counting of incident particles (photons, γ-rays, etc). Fluctuations in the emission-counting process can be described by modeling the data as realizations of Poisson random variables (Poisson data). A maximum-likelihood approach for image reconstruction from Poisson data was proposed in the mid-1980s. Since the consequent maximization problem is, in general, ill-conditioned, various kinds of regularization were introduced in the framework of the so-called Bayesian paradigm. A modification of the well-known Tikhonov regularization strategy results in the data-fidelity function being a generalized Kullback-Leibler divergence. A relevant issue is then to find rules for selecting a proper value of the regularization parameter. In this paper we propose a criterion, nicknamed the discrepancy principle for Poisson data, that applies to both denoising and deblurring problems and fits naturally with the statistical properties of the data. The main purpose of the paper is to establish conditions, on the data and the imaging matrix, ensuring that the proposed criterion actually provides a unique value of the regularization parameter for various classes of regularization functions. A few numerical experiments are performed to demonstrate its effectiveness. More extensive numerical analysis and comparison with other proposed criteria will be the object of future work.
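The criterion rests on the fact that, for Poisson data y with mean x, the expected value of twice the generalized Kullback-Leibler divergence is approximately the number of data points, so a normalized discrepancy should be close to 1 at a well-chosen regularization parameter. A hedged NumPy sketch (the grid search over candidate parameters is a simplification for illustration; the paper studies conditions under which the discrepancy equation has a unique root):

```python
import numpy as np

def poisson_discrepancy(y, x, eps=1e-12):
    """Normalized Kullback-Leibler discrepancy for Poisson data.

    Approximately 1 when x is statistically consistent with y,
    since E[2 * KL(y||x)] is roughly the number of pixels.
    """
    x = np.maximum(x, eps)
    kl = np.sum(y * np.log(np.maximum(y, eps) / x) + x - y)
    return 2.0 * kl / y.size

def select_beta(y, reconstruct, betas):
    """Pick, from a candidate grid, the regularization parameter whose
    reconstruction brings the normalized discrepancy closest to 1.

    reconstruct : callable beta -> regularized reconstruction x_beta
    """
    scored = [(abs(poisson_discrepancy(y, reconstruct(b)) - 1.0), b) for b in betas]
    return min(scored)[1]
```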
We present two wide-field (~5′ × 3.5′), diffraction-limited (λ/D ≈ 0.5″ at 10 μm), broadband 10 and 20 μm images of the Orion Nebula, plus six 7-13 μm narrowband (λ/Δλ ≈ 1) images of the BN/KL complex, taken at the 3.8 m UKIRT telescope with the MPIA MAX camera. The wide-field images, centered on the Trapezium and BN/KL regions, are mosaics of 35″ × 35″ frames obtained with standard chopping and nodding techniques and reconstructed using a new restoration method developed for this project. They show the filamentary structure of the dust emission from the walls of the H II region and reveal a remarkable new group of arclike structures ~1′ to the south of the Trapezium. The morphology of the Ney-Allen Nebula, produced by wind-wind interaction in the vicinity of the Trapezium stars, suggests a complex kinematical structure at the center of the cluster. We find indications that one of the most massive members of the cluster, the B0.5 V star θ¹ Ori D, is surrounded by a photoevaporated circumstellar disk. Among the four historic Trapezium OB stars, this is the only one without a binary companion, suggesting that stellar multiplicity and the presence of massive circumstellar disks may be mutually exclusive. Concerning the BN/KL complex, we find evidence for extended optically thin silicate emission on top of the deep 10 μm absorption feature. Assuming a simple two-component model, we map with ≈0.5″ spatial resolution the foreground optical depth, color temperature, and mid-IR luminosity of the embedded sources. We resolve a conspicuous point source at the location of the IRc2-A knot, approximately 0.5″ north of the deeply embedded H II region "I". We analyze the spectral profile of the 10 μm silicate absorption feature and find indications of grain crystallization in the harsh nebular environment. In the OMC-1 South region, we detect several point sources and discuss their association with the mass-loss phenomenology observed at optical and millimeter wavelengths.
Finally, we list the positions and photometry of 177 point sources, the large majority of which are detected for the first time in the mid-IR. Twenty-two of them lack a counterpart at shorter wavelengths and are therefore candidates for deeply embedded protostars. The comparison of photometric data obtained at two different epochs reveals that source variability at 10 μm is present up to a level of ~1 mag on a timescale of ~2 yr. With the possible exception of a pair of OB stars, all point sources detected at shorter wavelengths display 10 μm emission well above the photospheric level, which we attribute to circumstellar disk emission. The recent model of Robberto et al. provides the simplest explanation for the observed mid-IR excess.
Background This pilot study was designed to develop a fully automatic and quantitative scoring system of B-lines (QLUSS: quantitative lung ultrasound score) involving the pleural line, and to compare it with previously described semi-quantitative scores in the measurement of extravascular lung water as determined by standard thermodilution. Methods This was a prospective observational study of 12 patients admitted to the intensive care unit with acute respiratory distress, each contributing 12 lung ultrasound (LUS) frames. The data collected from each patient consisted of five different scores: four semi-quantitative scores (nLUSS, cLUSS, qLUSS, %LUSS) and one quantitative score (QLUSS). The association between the LUS scores and extravascular lung water (EVLW) was determined by simple linear regression (SLR) and robust linear regression (RLR). Correlations between the LUS scores were assessed with the Spearman rank test. Inter-observer variability was tested by computing the intraclass correlation coefficient (ICC) in a two-way model for agreement, based on scores obtained by different raters blinded to the patients' conditions and clinical history. Results In the SLR, QLUSS showed a stronger association with EVLW (R² = 0.57) than cLUSS (R² = 0.45) and nLUSS (R² = 0.000), but a weaker association than qLUSS (R² = 0.85) and %LUSS (R² = 0.72). With RLR, QLUSS showed an association with EVLW (R² = 0.86) comparable to qLUSS (R² = 0.85) and stronger than %LUSS (R² = 0.72). QLUSS was significantly correlated with qLUSS (r = 0.772; p = 0.003) and %LUSS (r = 0.757; p = 0.005), but not with cLUSS (r = 0.561; p = 0.058) or nLUSS (r = 0.105; p = 0.744). Moreover, QLUSS showed the highest ICC (0.998; 95% CI 0.996-0.999) among the LUS scores. Conclusions This study demonstrates that computer-aided scoring of the percentage of the pleural line affected by B-lines has the potential to assess EVLW.
Once validated on a larger dataset composed of multiple real-time frames, QLUSS may have a significant impact. This approach has the potential to be advantageous in terms of faster data analysis and applicability to large sets of data without increased costs. It is, however, not applicable to pleural effusions or consolidations.
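The reported associations rest on textbook quantities: the R² of a simple linear regression equals the squared Pearson correlation, and the Spearman coefficient is the Pearson correlation of the ranks. A minimal NumPy sketch of these two measures (illustrative only; the study also used robust regression and a two-way ICC, which are not reproduced here):

```python
import numpy as np

def slr_r2(score, evlw):
    """R^2 of a simple linear regression of EVLW on a LUS score.

    For one predictor, R^2 is the squared Pearson correlation coefficient.
    """
    r = np.corrcoef(score, evlw)[0, 1]
    return r ** 2

def spearman_rho(a, b):
    """Spearman rank correlation between two score series.

    Pearson correlation of the ranks; no tie correction, so this sketch
    assumes distinct values within each series.
    """
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]
```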
Abstract. In this paper we propose a solution to the problem of reducing the boundary effects (ripples) in the deconvolution of astronomical images. The approach applies to the Richardson-Lucy method (RLM), namely the most frequently used deconvolution method in astronomy, and is based on the idea of using RLM to attempt a reconstruction of the astronomical target in a domain broader than that of the detected image. Even if the reconstruction outside the image domain is, in general, not reliable, this approach in a sense lets RLM choose the appropriate boundary conditions, and as a consequence the reconstruction inside the domain is considerably improved. We propose a simple implementation of this approach that reduces its computational burden. Numerical experiments indicate that excellent results can be obtained. Extensions and applications of the method are briefly discussed.
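The idea of reconstructing on a broader domain can be sketched as a masked Richardson-Lucy iteration: the object is represented on an enlarged array, while the data enter the update only where they were actually measured, through a mask and the corresponding normalization. A minimal NumPy sketch under these assumptions (the padding width and the FFT-based convolution are illustrative choices, not the paper's specific implementation):

```python
import numpy as np

def rl_broad_domain(y, psf, pad=16, n_iter=50, eps=1e-12):
    """Richardson-Lucy on a domain broader than the detected image.

    The measured image y (domain S) is embedded in a larger array (domain R);
    the update uses the data only inside S, via the mask M, so the iteration
    extrapolates the object outside S and reduces boundary ripples inside it.
    """
    ny, nx = y.shape
    R = (ny + 2 * pad, nx + 2 * pad)
    M = np.zeros(R); M[pad:pad + ny, pad:pad + nx] = 1.0   # indicator of S in R
    y_ext = np.zeros(R); y_ext[pad:pad + ny, pad:pad + nx] = y
    psf = psf / psf.sum()
    otf = np.fft.rfft2(np.fft.ifftshift(np.pad(psf, pad)), s=R)
    conv = lambda f, H: np.fft.irfft2(np.fft.rfft2(f) * H, s=R)
    c = np.maximum(conv(M, np.conj(otf)), eps)             # normalization A^T M
    x = np.full(R, y.mean())
    for _ in range(n_iter):
        ratio = M * y_ext / np.maximum(conv(x, otf), eps)  # data ratio, only in S
        x = (x / c) * conv(ratio, np.conj(otf))            # masked RL update
    return x[pad:pad + ny, pad:pad + nx]                   # restriction to S
```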
Abstract. A complete exploitation of the imaging properties of the Large Binocular Telescope (LBT) will require a generalization of the restoration methods that apply to the case of a single image: several different observations must be combined to obtain a high-resolution representation of a given target. The purpose of this paper is to extend to this problem some of the most widely used restoration methods, including linear methods such as Tikhonov regularization, as well as iterative regularization methods providing positive solutions. The proposed methods are implemented and tested on simulated LBT images of diffuse and point-like objects. The results are discussed from the point of view of both accuracy and computational efficiency, because LBT images may contain, in principle, up to 10^8 pixels.
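For the iterative methods with positivity, the single-image Richardson-Lucy update generalizes naturally to p observations of the same object with different PSFs: each iteration averages the p back-projected data ratios. A minimal NumPy sketch of this multiple-image update (illustrative; the linear Tikhonov-type methods also treated in the paper are not shown):

```python
import numpy as np

def multi_image_rl(ys, psfs, n_iter=50, eps=1e-12):
    """Multiple-image Richardson-Lucy: one object, p observations with
    different (e.g. differently oriented) PSFs, as in LBT imaging.

    ys   : list of p observed images, same shape
    psfs : list of p PSFs, each centered in its array
    """
    shape = ys[0].shape
    otfs = [np.fft.rfft2(np.fft.ifftshift(p_ / p_.sum()), s=shape) for p_ in psfs]
    conv = lambda f, H: np.fft.irfft2(np.fft.rfft2(f) * H, s=shape)
    x = np.full(shape, np.mean(ys))                # flat initial guess
    p = len(ys)
    for _ in range(n_iter):
        acc = np.zeros(shape)
        for y, H in zip(ys, otfs):
            acc += conv(y / np.maximum(conv(x, H), eps), np.conj(H))
        x *= acc / p                               # average of back-projected ratios
    return x
```

Because each observation constrains the object along a different direction of the (u, v) plane, the combined update can recover detail that no single image contains.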