Speckle is a granular disturbance, usually modeled as multiplicative noise, that affects synthetic aperture radar (SAR) images, as well as all coherent images. Over the last three decades, several methods have been proposed for the reduction of speckle, or despeckling, in SAR images. The goal of this paper is to provide a comprehensive review of despeckling methods since their birth, over thirty years ago, highlighting trends and changing approaches over the years. The concept of fully developed speckle is explained. Drawbacks of homomorphic filtering are pointed out. The assets of multiresolution despeckling, as opposed to spatial-domain despeckling, are highlighted. The advantages of undecimated, or stationary, wavelet transforms over decimated ones are also discussed. Bayesian estimators and probability density function (pdf) models in both the spatial and multiresolution domains are reviewed. Scale-space varying pdf models, as opposed to scale-varying models, are promoted. Promising methods following non-Bayesian approaches, such as nonlocal (NL) filtering and total variation (TV) regularization, are reviewed and compared to spatial- and wavelet-domain Bayesian filters. Both established and new trends for the assessment of despeckling are presented. A few experiments on simulated data and real COSMO-SkyMed SAR images highlight, on the one hand, the cost-performance tradeoff of the different methods and, on the other hand, the effectiveness of solutions purposely designed for SAR heterogeneity and not fully developed speckle. Eventually, upcoming methods based on new concepts of signal processing, such as compressive sensing, are foreseen as a new generation of despeckling, after spatial-domain and multiresolution-domain methods.
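The multiplicative, fully developed speckle model mentioned above can be sketched in a few lines. The snippet below is illustrative only (function and variable names are not from the paper): for L-look intensity SAR data, fully developed speckle is commonly modeled as white, Gamma-distributed noise with unit mean and variance 1/L, multiplying the underlying reflectivity.

```python
import numpy as np

def simulate_speckle(reflectivity, looks=1, seed=0):
    """Multiply a reflectivity image by fully developed L-look speckle.

    Fully developed speckle in L-look intensity data is modeled here as
    white Gamma(L, 1/L) noise: unit mean, variance 1/L, independent of
    the scene. Names and defaults are illustrative.
    """
    rng = np.random.default_rng(seed)
    speckle = rng.gamma(shape=looks, scale=1.0 / looks,
                        size=reflectivity.shape)
    return reflectivity * speckle

# A flat 64x64 scene of constant reflectivity 100: the observed mean
# stays near 100, while the variance is about 100**2 / looks.
scene = np.full((64, 64), 100.0)
noisy = simulate_speckle(scene, looks=4)
```

Note how the noise variance scales with the squared signal level, which is exactly the nonlinearity that makes despeckling harder than additive denoising.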
In the last decades, several methods have been developed for despeckling synthetic aperture radar (SAR) images. A considerable number of them have been derived under the assumption of a fully developed speckle model, in which the multiplicative speckle noise is supposed to be a white process. Unfortunately, the transfer function of SAR acquisition systems can introduce a statistical correlation that decreases the despeckling efficiency of such filters. In this work, a whitening method is proposed for processing a complex image acquired by a SAR system. We demonstrate that the proposed approach allows classical despeckling algorithms to be successfully applied. First, we estimate the SAR system frequency response based on some statistical properties of the acquired image and on realistic assumptions. Then, a decorrelation process is applied to the acquired image, taking into account the presence of point targets. Finally, the image is despeckled. The experimental results show that the despeckling filters achieve better performance when they are preceded by the proposed whitening method; furthermore, the radiometric characteristics of the image are preserved.
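The idea of estimating and removing the system's spectral shaping can be illustrated with a much simplified sketch (the actual method also handles point targets and uses a more careful estimator; everything below is an assumption-laden toy version). Under fully developed speckle the complex signal is white, so any shaping of the average spectrum is attributed to the acquisition transfer function, which is estimated here as a separable magnitude profile and divided out.

```python
import numpy as np

def whiten_complex_image(z, eps=1e-6):
    """Illustrative spectral whitening of a complex image.

    Assumes the underlying speckle is white, so the smoothed magnitude
    spectrum approximates the (separable) system response |H(f)|,
    which is then divided out. A simplified sketch, not the paper's
    actual estimator.
    """
    Z = np.fft.fft2(z)
    mag = np.abs(Z)
    h_row = mag.mean(axis=0)          # average column profile
    h_col = mag.mean(axis=1)          # average row profile
    H = np.outer(h_col, h_row)        # separable |H| estimate
    H /= H.mean()                     # scale is irrelevant, normalize
    return np.fft.ifft2(Z / (H + eps))

# Demo: correlate white complex noise with a separable lowpass
# response, then whiten and compare spectral flatness.
rng = np.random.default_rng(1)
n = 64
z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
f = np.fft.fftfreq(n)
h = 1.0 / (1.0 + (f / 0.15) ** 2)
zc = np.fft.ifft2(np.fft.fft2(z) * np.outer(h, h))  # correlated input
zw = whiten_complex_image(zc)

def spectral_cv(img):
    """Coefficient of variation of the averaged magnitude spectrum;
    lower means flatter (whiter)."""
    m = np.abs(np.fft.fft2(img)).mean(axis=0)
    return m.std() / m.mean()
```

After whitening, the averaged spectrum of `zw` is markedly flatter than that of the correlated input `zc`, which is the property the downstream despeckling filters rely on.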
The current paper presents a system for the dynamic simulation of the human hand. Simulating the human hand offers the capability to acquire handshapes that correspond to letters of the finger alphabet, enabling an integrated representation of words and sentences. The hand model is designed using the Autodesk Inventor™ and Autodesk AutoCAD™ design environments. The user is able to type words or sentences, which are dynamically translated into postures according to the finger alphabet. The system is based on the physiometric characteristics of an average human hand. Each part is designed with high precision, integrating all the functionality needed to perform the required movements. The system has been tested on more than 500 words, with a letter representation success rate in the range of 95-97%.
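The text-to-posture translation step described above amounts to mapping each typed letter to a finger-alphabet posture that then drives the hand model. A hypothetical sketch of just that mapping layer (all names here are illustrative, not from the described system):

```python
# Hypothetical sketch of the letter-to-posture lookup that precedes
# the dynamic hand simulation; posture identifiers are placeholders.
POSTURES = {c: f"posture_{c}" for c in "abcdefghijklmnopqrstuvwxyz"}

def text_to_postures(text):
    """Translate typed text into a sequence of posture identifiers,
    one per finger-alphabet letter; characters without a posture
    (digits, punctuation) are skipped."""
    return [POSTURES[c] for c in text.lower() if c in POSTURES]

# text_to_postures("Hi") -> ["posture_h", "posture_i"]
```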
The undecimated wavelet transform and the maximum a posteriori (MAP) criterion have been applied to the problem of SAR image despeckling. The MAP solution is based on the assumption that wavelet coefficients have a known distribution. In previous works, the generalized Gaussian (GG) function has been successfully employed. Furthermore, despeckling methods can be improved by classifying wavelet coefficients according to their texture energy. A major drawback of using the GG distribution is the high computational cost, since the MAP solution can be found only numerically. In this work, a new model of the statistics of wavelet coefficients is proposed. Observations of the estimated GG shape parameters relative to the reflectivity and to the speckle noise suggest that their distributions can be approximated by a Laplacian and a Gaussian function, respectively. Under these hypotheses, a closed-form solution of the MAP estimation problem can be achieved. As in the GG case, classification of wavelet coefficients according to their texture content may also be exploited in the proposed method. Experimental results show that the fast MAP estimator based on the Laplacian-Gaussian assumption and on classification of coefficients reaches almost the same performance as the GG version in terms of speckle removal, with a gain in computational cost of about one order of magnitude.

Index Terms: Despeckling, synthetic aperture radar (SAR) images, undecimated wavelet transform (UDWT), maximum a posteriori probability (MAP) estimation.

I. INTRODUCTION

Speckle removal is a major concern in the analysis of synthetic aperture radar (SAR) images. Speckle noise is a granular disturbance that affects the observed reflectivity. Usually, it is modeled as a multiplicative noise: this nonlinear behavior makes the process of original information retrieval a nontrivial task [1]. In recent years, multiresolution analysis tools have been successfully applied to despeckling [2]-[5].
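The appeal of the Laplacian-Gaussian pairing is that the MAP estimate becomes closed-form. As a self-contained illustration (not the paper's full estimator, which operates on undecimated wavelet coefficients with signal-dependent parameters): for a scalar observation y = x + n with a Laplacian prior x ~ Laplace(0, b) and Gaussian noise n ~ N(0, σ²), maximizing the posterior yields soft thresholding with threshold σ²/b, so no numerical optimization is needed.

```python
import numpy as np

def map_laplace_gaussian(y, sigma_n, b):
    """Closed-form MAP estimate of a Laplacian-distributed quantity
    observed in additive Gaussian noise.

    For y = x + n, x ~ Laplace(0, b), n ~ N(0, sigma_n**2), the MAP
    solution is soft thresholding: sign(y) * max(|y| - sigma_n**2/b, 0).
    Illustrative of why this model admits a fast closed form, unlike
    the generalized Gaussian prior, which requires numerical solution.
    """
    thr = sigma_n ** 2 / b
    return np.sign(y) * np.maximum(np.abs(y) - thr, 0.0)

# With sigma_n=1 and b=2, the threshold is 0.5: small coefficients
# are zeroed, large ones are shrunk toward zero by 0.5.
out = map_laplace_gaussian(np.array([3.0, -0.5, 0.2]), 1.0, 2.0)
```

Evaluating this shrinkage rule coefficient-by-coefficient is what gives the reported order-of-magnitude speedup over numerically solving the GG-prior MAP problem.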
Several solutions were proposed based on the maximum a posteriori probability (MAP) criterion and different distributions: the Γ-distribution [4], the α-stable distribution [2], the Pearson system of distributions [3], and the generalized Gaussian (GG) [6], [7], to mention just a few examples. In [6], it has been shown that the MAP criterion in the undecimated wavelet domain, associated with the GG distribution, leads to the following procedure: 1) estimation of
Template-based reverse engineering approaches represent a relatively poorly explored strategy in the field of CAD reconstruction from polygonal models. Inspired by recent works suggesting the possibility of exploiting a parametric description (i.e., a CAD template) of the object to be reconstructed in order to retrieve a meaningful digital representation, a novel reverse engineering approach for the reconstruction of CAD models starting from 3D mesh data is proposed. The reconstruction process relies on a CAD template whose feature tree and geometric constraints are defined according to a priori information on the physical object. The CAD template is fitted to the mesh data, optimizing its dimensional parameters and positioning/orientation by means of a particle swarm optimization algorithm. As a result, a parametric CAD model that exactly fulfils the imposed geometric relations is produced, and a feature tree defining an associative modelling history is available to the reverse engineer. The proposed implementation exploits a cooperation between a CAD software package (Siemens NX) and a numerical software environment (MATLAB). Five reconstruction tests, covering both synthetic and real-scanned mesh data, are presented and discussed in the manuscript; the results are finally compared with models generated by state-of-the-art reverse engineering software, and key aspects to be addressed in future work are hinted at.

Highlights:
- A novel CAD reconstruction method fitting a CAD template model to mesh data.
- A feature-based parametric-associative modelling history is retrieved.
- The fitting process is controlled by a particle swarm optimization algorithm.
- Accuracy of reconstructed models is comparable to or better than state-of-the-art results.
- Computational costs and required time are at the moment considerable.
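The core fitting loop, tuning a template's parameters so that it matches measured geometry, can be sketched with a minimal particle swarm optimizer. The example below is a toy analogue (a 2D circle "template" with parameters center and radius, fitted to noisy boundary samples), not the paper's NX/MATLAB implementation; all function names and PSO constants are illustrative.

```python
import numpy as np

def pso_fit(cost, bounds, n_particles=30, iters=150, seed=0):
    """Minimal particle swarm optimizer.

    Standard PSO update: each particle keeps its personal best, the
    swarm shares a global best, and velocities blend inertia with
    attraction toward both. Constants (0.7, 1.5, 1.5) are typical
    textbook values, not tuned.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    pbest = x.copy()
    pcost = np.array([cost(p) for p in x])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2,) + x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        g = pbest[pcost.argmin()].copy()
    return g

# Noisy samples of a circle centered at (2, -1) with radius 3.
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
pts = np.c_[2 + 3 * np.cos(t), -1 + 3 * np.sin(t)]
pts += 0.01 * rng.standard_normal((200, 2))

def circle_cost(p):
    """Mean squared distance of the samples from the circle (cx, cy, r)."""
    d = np.hypot(pts[:, 0] - p[0], pts[:, 1] - p[1]) - p[2]
    return np.mean(d ** 2)

best = pso_fit(circle_cost,
               (np.array([-10.0, -10.0, 0.1]),
                np.array([10.0, 10.0, 10.0])))
```

The derivative-free nature of PSO is what makes it suitable here: the cost of a candidate CAD template (distance of the regenerated model from the mesh) is only available by evaluation, not in closed form.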