This paper addresses the problem of recovering a super-resolved image from a set of warped, blurred, and decimated versions thereof. Several algorithms have already been proposed for this general problem. In this paper, we concentrate on a special case where the warps are pure translations, the blur is space-invariant and identical for all the images, and the noise is white. We exploit previous results to develop a new, highly efficient super-resolution reconstruction algorithm for this case, which separates the treatment into deblurring and measurement fusion. The fusion part is shown to be a very simple non-iterative algorithm that preserves the optimality of the entire reconstruction process in the maximum-likelihood sense. Simulations demonstrate the capabilities of the proposed algorithm.
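The non-iterative fusion step described above reduces, for integer translations on the high-resolution grid, to a "shift-and-add" operation: each low-resolution sample is placed at its high-resolution location and co-located samples are averaged, which is the maximum-likelihood estimate under white noise. The following is a minimal sketch of that idea (the function name and the assumption of integer, blur-free shifts are illustrative, not the paper's exact formulation):

```python
def shift_and_add_fuse(lr_images, shifts, scale):
    """Fuse translated, decimated low-res images onto a high-res grid.

    Each LR pixel (i, j) of an image with integer HR-grid shift (dy, dx)
    lands at HR position (i*scale + dy, j*scale + dx); samples mapping
    to the same HR pixel are averaged (the ML estimate for white noise).
    """
    h = len(lr_images[0]) * scale
    w = len(lr_images[0][0]) * scale
    acc = [[0.0] * w for _ in range(h)]
    cnt = [[0] * w for _ in range(h)]
    for img, (dy, dx) in zip(lr_images, shifts):
        for i, row in enumerate(img):
            for j, v in enumerate(row):
                y, x = i * scale + dy, j * scale + dx
                acc[y][x] += v
                cnt[y][x] += 1
    return [[acc[y][x] / cnt[y][x] if cnt[y][x] else 0.0
             for x in range(w)] for y in range(h)]
```

With four quarter-pixel-shifted decimations of a 2x scale, every HR pixel is observed once and the HR image is recovered exactly; deblurring would then be applied separately, as the paper's decomposition suggests.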
This paper suggests a discriminative approach for wavelet denoising in which a set of mapping functions (MFs) is applied to the transform coefficients in an attempt to produce a noise-free image. In contrast to descriptive approaches, modeling image or noise priors is not required here; the MFs are learned directly from an ensemble of example images using least-squares fitting. The suggested scheme generates a novel set of MFs that are essentially different from traditional soft/hard thresholding in the over-complete case. These MFs are demonstrated to achieve performance comparable to state-of-the-art denoising approaches. Additionally, this framework enables seamless customization of the shrinkage operation to a new set of restoration problems not previously addressed with shrinkage techniques, such as deblurring, JPEG artifact removal, and various types of additive noise that are not necessarily white Gaussian noise.
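The discriminative idea, fitting a coefficient-domain mapping function by least squares on noisy/clean example pairs rather than deriving it from a prior, can be sketched with a piecewise-constant MF, for which the least-squares fit is simply the per-bin mean of the clean targets. This is an illustrative simplification, not the paper's actual MF parameterization:

```python
import bisect

def learn_mf(noisy_coeffs, clean_coeffs, edges):
    """Least-squares fit of a piecewise-constant mapping function.

    Within each bin of the noisy coefficient value (bins delimited by
    `edges`), the optimal constant is the mean of the corresponding
    clean coefficients from the training ensemble.
    """
    nbins = len(edges) + 1
    sums, cnts = [0.0] * nbins, [0] * nbins
    for x, y in zip(noisy_coeffs, clean_coeffs):
        b = bisect.bisect(edges, x)
        sums[b] += y
        cnts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, cnts)]

def apply_mf(coeffs, edges, mf):
    """Shrink transform coefficients with the learned mapping function."""
    return [mf[bisect.bisect(edges, x)] for x in coeffs]
```

On sparse coefficients the learned MF naturally sends small (noise-dominated) values toward zero while keeping large ones, recovering thresholding-like behavior without ever positing a noise model.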
A fast pattern matching scheme termed Matching by Tone Mapping (MTM) is introduced, which allows matching under nonlinear tone mappings. We show that, when the tone mapping is approximated by a piecewise constant or piecewise linear function, a fast computational scheme is possible, requiring computation time similar to the fast implementation of normalized cross-correlation (NCC). In fact, the MTM measure can be viewed as a generalization of NCC to nonlinear mappings, and it reduces to NCC when the mappings are restricted to be linear. We empirically show that MTM is highly discriminative and robust to noise, with performance comparable to that of the well-performing mutual information measure, while remaining on par with NCC in computation time.
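For a piecewise-constant tone mapping, the best mapping from pattern to window has a closed form: on each slice of the pattern's gray-level range, the optimal constant is the mean of the co-located window pixels, so the residual is the within-slice variance of the window. The sketch below shows this per-window distance (the normalization and the pattern-to-window direction follow one of the two MTM variants; names and defaults are illustrative):

```python
def mtm_distance(pattern, window, nbins=8, lo=0.0, hi=256.0):
    """Piecewise-constant MTM distance from pattern to window.

    The tone mapping is restricted to be constant on each of `nbins`
    equal slices of the pattern's value range; the best constant per
    slice is the mean of the co-located window pixels, so the residual
    is the within-slice variance of the window, normalized by the
    window's total variance.  0 means a perfect nonlinear tone match.
    """
    groups = [[] for _ in range(nbins)]
    for p, w in zip(pattern, window):
        b = min(int((p - lo) / (hi - lo) * nbins), nbins - 1)
        groups[b].append(w)
    residual = 0.0
    for g in groups:
        if g:
            m = sum(g) / len(g)
            residual += sum((w - m) ** 2 for w in g)
    mean_w = sum(window) / len(window)
    var_w = sum((w - mean_w) ** 2 for w in window)
    return residual / var_w if var_w else 0.0
```

Note that the mapping need not be monotonic: any window that is a per-gray-level relabeling of the pattern scores a distance of exactly 0, which is what makes MTM strictly more general than NCC.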
In this paper, we introduce a family of filter kernels, the Gray-Code Kernels (GCK), and demonstrate their use in image analysis. Filtering an image with a sequence of Gray-Code Kernels is highly efficient, requiring only two operations per pixel for each filter kernel, independent of the size or dimension of the kernel. We show that the family of kernels is large and includes the Walsh-Hadamard kernels, among others. The GCK can be used to approximate any desired kernel and, as such, forms a complete representation. The efficiency of computation using a sequence of GCK filters can be exploited for various real-time applications, such as pattern detection, feature extraction, texture analysis, texture synthesis, and more.
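The two-operations-per-pixel property comes from a recurrence linking the filter results of consecutive related kernels: once one kernel's output is known, the next kernel's output at each pixel needs only two additions/subtractions, regardless of kernel length. A minimal 1D sketch for the simplest Walsh-Hadamard pair v_p = [u, u] and v_q = [u, -u] with u = [1] (so the prefix length delta = 1; sign conventions here are derived for this case and may differ from the paper's notation):

```python
def correlate(kernel, signal):
    """Direct (brute-force) correlation, used for reference and seeding."""
    k, n = len(kernel), len(signal)
    return [sum(kernel[j] * signal[i + j] for j in range(k))
            for i in range(n - k + 1)]

def gck_next(b_p, signal, delta=1):
    """Filter results for v_q = [u, -u] from those of v_p = [u, u].

    For i >= delta the recurrence
        b_q[i] = b_p[i - delta] - b_p[i] - b_q[i - delta]
    costs only two operations per output sample, independent of the
    kernel length; the first `delta` outputs are seeded directly.
    """
    # Seed: in this sketch u = [1], hence v_q = [1, -1] for delta = 1.
    b_q = correlate([1, -1], signal)[:delta]
    for i in range(delta, len(b_p)):
        b_q.append(b_p[i - delta] - b_p[i] - b_q[i - delta])
    return b_q
```

Chaining such recurrences over a Gray-code-ordered sequence of kernels is what lets a whole Walsh-Hadamard projection be computed at two operations per pixel per kernel.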