This paper presents two methods for the fusion of infrared (IR) and visible surveillance images. The first method combines the Curvelet Transform (CT) with the Discrete Wavelet Transform (DWT). Since wavelets do not represent long edges well, while curvelets struggle with small features, our objective is to combine both to achieve better performance. The second approach uses the Discrete Wavelet Packet Transform (DWPT), which provides multiresolution analysis in the high-frequency bands as well and hence handles edges better. The performance of the proposed methods has been extensively tested on a number of multimodal surveillance images and compared with several existing transform-domain fusion methods. Experimental results show that evaluation based on the commonly used criteria, such as entropy, gradient, and contrast, is not sufficient, as in some cases these criteria are inconsistent with visual quality. They also demonstrate that the Petrovic and Xydeas image fusion metric is a more appropriate criterion for the fusion of IR and visible images, since in all the tested fused images the visual quality agrees with this metric. The analysis shows a significant increase in the quality of the fused image, both visually and quantitatively. The major achievement of the proposed fusion methods is their reduced artifacts, one of the most desired features for fusion in surveillance applications.
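To illustrate the transform-domain fusion idea described above, here is a minimal, self-contained sketch that substitutes a one-level 2-D Haar DWT for the paper's CT/DWT and DWPT machinery: approximation (LL) coefficients of the two source images are averaged, and detail coefficients are fused by maximum absolute value. All function names are ours; this is an illustrative simplification, not the authors' implementation.

```python
import math

def haar_step(x):
    """One level of the orthonormal 1-D Haar transform (even-length input)."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def inv_haar_step(approx, detail):
    """Inverse of haar_step."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

def dwt2(img):
    """One-level 2-D Haar DWT: LL|HL over LH|HH quadrants, row-major."""
    rows = []
    for r in img:                       # transform rows
        a, d = haar_step(r)
        rows.append(a + d)
    out_cols = []
    for c in zip(*rows):                # transform columns
        a, d = haar_step(list(c))
        out_cols.append(a + d)
    return [list(r) for r in zip(*out_cols)]

def idwt2(coeffs):
    """Inverse one-level 2-D Haar DWT (columns first, then rows)."""
    h = len(coeffs) // 2
    out_cols = [inv_haar_step(list(c[:h]), list(c[h:])) for c in zip(*coeffs)]
    rows = [list(r) for r in zip(*out_cols)]
    w = len(rows[0]) // 2
    return [inv_haar_step(r[:w], r[w:]) for r in rows]

def fuse_dwt(img1, img2):
    """Average the LL band; take the max-|coefficient| in the detail bands."""
    c1, c2 = dwt2(img1), dwt2(img2)
    h, w = len(c1) // 2, len(c1[0]) // 2
    fused = []
    for i in range(len(c1)):
        row = []
        for j in range(len(c1[0])):
            if i < h and j < w:         # LL quadrant: average
                row.append((c1[i][j] + c2[i][j]) / 2.0)
            else:                       # detail quadrants: sharper coefficient wins
                row.append(c1[i][j] if abs(c1[i][j]) >= abs(c2[i][j]) else c2[i][j])
        fused.append(row)
    return idwt2(fused)
```

The max-absolute-value rule for detail coefficients is a standard heuristic: larger detail magnitudes correspond to stronger local edges, which is the information each source image contributes best.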
In this paper we present a spatial-domain method to fuse multifocus images. The fused image pixel is selected from one of the source images based on a novel selection criterion that exploits the statistical properties of the pixel's neighborhood. The eigenvalue of the unbiased estimate of the covariance matrix of an image block depends on the strength of the edges in the block and thus provides a good basis for selecting or rejecting a pixel, giving preference to the pixel with the larger eigenvalue and hence the sharper neighborhood. To prevent a noise pixel from being selected as a fused image pixel, a continuity constraint is imposed on the selection criterion. The performance of the method has been extensively tested on several pairs of multifocus images and compared quantitatively with existing methods. Experimental results show that the proposed method improves fusion quality by reducing the loss of information by almost 50% and noise by more than 95%. They also show that evaluation based on widely used criteria such as entropy, gradient, and deviation may not be sufficient, as in some cases these criteria are inconsistent with the ground truth. The results demonstrate that the Petrovic metrics correlate with both the ground truth and the visual quality.
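The eigenvalue-based selection rule above can be sketched as follows. This is our own illustrative reading, not the authors' code: we treat the rows of a small block as observations, form the unbiased covariance estimate across its columns, and take the dominant eigenvalue (via power iteration) as the sharpness score; the pixel from the block with the larger score wins. The continuity constraint mentioned in the abstract is omitted for brevity, and all names are ours.

```python
import math

def covariance(block):
    """Unbiased covariance estimate of a block, rows as observations."""
    n, m = len(block), len(block[0])
    means = [sum(block[i][j] for i in range(n)) / n for j in range(m)]
    cov = [[0.0] * m for _ in range(m)]
    for j in range(m):
        for k in range(m):
            s = sum((block[i][j] - means[j]) * (block[i][k] - means[k])
                    for i in range(n))
            cov[j][k] = s / (n - 1)   # unbiased: divide by n - 1
    return cov

def largest_eigenvalue(mat, iters=200):
    """Power iteration for the dominant eigenvalue of a PSD matrix."""
    m = len(mat)
    # Deterministic non-uniform start vector, to avoid a start that is
    # exactly orthogonal to the dominant eigenvector.
    v = [float(i + 1) for i in range(m)]
    lam = 0.0
    for _ in range(iters):
        w = [sum(mat[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = math.sqrt(sum(x * x for x in w))
        if norm == 0.0:               # zero matrix (e.g. flat block)
            return 0.0
        v = [x / norm for x in w]
        lam = norm                    # ||Av|| -> dominant eigenvalue (PSD)
    return lam

def select_pixel(block1, block2, p1, p2):
    """Keep the pixel whose neighborhood has the larger dominant eigenvalue."""
    e1 = largest_eigenvalue(covariance(block1))
    e2 = largest_eigenvalue(covariance(block2))
    return p1 if e1 >= e2 else p2
```

For example, a block containing a strong edge yields a large dominant eigenvalue, while a flat (defocused or uniform) block yields one near zero, so `select_pixel` prefers the in-focus source.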