Outdoor images captured in sand-dust environments degrade the performance of various remote computer vision tasks because they suffer from severe color casts, low contrast, and poor visibility. Although sand-dust image restoration is as important as haze removal and underwater image enhancement, it has not been sufficiently studied. In this paper, we present a novel color balance algorithm for sand-dust image enhancement. The aim of the proposed method is to obtain coincident chromatic histograms. First, we introduce a pixel-adaptive color correction method that uses the means and standard deviations of the chromatic histograms; the pixels of each color component are adjusted based on the statistical characteristics of the green component. Second, a green-mean-preserving color normalization technique is presented. However, simply setting the means of the red and blue components to that of the green component can produce undesirable output because the red or blue component of many sand-dust images has a narrow histogram with a high peak. To address this problem, we propose a histogram shifting algorithm that makes the red and blue histograms overlap the green histogram as much as possible, which reduces bluish or reddish artifacts in the enhanced image. Finally, image adjustment is applied to improve the brightness of the sand-dust image. We performed extensive experiments on various sand-dust images and compared the performance of the proposed method with that of state-of-the-art enhancement methods. The results indicate that the proposed enhancement scheme outperforms existing approaches in terms of both subjective and objective quality.

INDEX TERMS Sand-dust image enhancement, color normalization, green-mean preserving, maximum overlapped histogram, coincident chromatic histogram.
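As an illustrative sketch of the core idea, the red and blue statistics can be matched to those of the green channel by a per-channel mean/standard-deviation transfer. This global formulation (rather than the paper's pixel-adaptive rule with histogram shifting) and the function name are assumptions for illustration only:

```python
import numpy as np

def green_guided_color_correction(img):
    """Sketch: shift and scale the red and blue channels so that their
    means and standard deviations match those of the green channel.
    NOTE: this is a simplified global transfer, not the paper's
    pixel-adaptive correction or histogram-shifting algorithm."""
    x = img.astype(np.float64)
    g = x[..., 1]
    mu_g, sigma_g = g.mean(), g.std()
    out = x.copy()
    for c in (0, 2):  # red and blue channels; green is preserved
        ch = x[..., c]
        mu_c, sigma_c = ch.mean(), ch.std()
        # standardize the channel, then rescale to the green statistics
        out[..., c] = (ch - mu_c) * (sigma_g / max(sigma_c, 1e-6)) + mu_g
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

After this transfer, all three chromatic histograms share the green component's mean and spread, which is the "coincident chromatic histogram" goal the abstract describes; the paper's histogram-shifting step additionally maximizes the overlap of the full histograms, not just their first two moments.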
In this paper, we propose an image splicing detection method that uses characteristic function moments of the inter-scale co-occurrence matrix in the wavelet domain. We construct co-occurrence matrices from pairs of wavelet difference values across inter-scale wavelet subbands. In this process, we do not apply a thresholding operation, which prevents information loss. To detect image splicing forgery, we extract high-order characteristic function moments of the two-dimensional joint density function generated from the inter-scale co-occurrence matrices. Because it uses only the luminance component of an image, our method can be applied to both color and grayscale image datasets. Experimental simulations demonstrate that the proposed method achieves good splicing detection performance: the detection accuracy exceeds 95% on average across four well-known splicing detection image datasets.
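A minimal sketch of such a feature pipeline, under several simplifying assumptions: a one-level Haar transform stands in for the paper's wavelet decomposition, coefficients are clipped into a fixed bin range (whereas the paper avoids thresholding), and all function names and the bin parameter `T` are hypothetical:

```python
import numpy as np

def haar_detail(x):
    """One level of a separable Haar transform; returns the
    approximation (LL) and one detail (HL-like) subband."""
    h, w = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2
    x = x[:h, :w]
    lo = (x[:, 0::2] + x[:, 1::2]) / 2.0
    hi = (x[:, 0::2] - x[:, 1::2]) / 2.0
    ll = (lo[0::2] + lo[1::2]) / 2.0
    hl = (hi[0::2] + hi[1::2]) / 2.0
    return ll, hl

def interscale_cooccurrence(child, parent_band, T=8):
    """Joint histogram (co-occurrence matrix) of quantized coefficient
    pairs across two scales: each finer-scale coefficient at (i, j) is
    paired with its parent at (i//2, j//2). Clipping into [-T, T] is a
    binning simplification; the paper avoids thresholding entirely."""
    a = np.clip(np.round(child), -T, T).astype(int)
    b = np.clip(np.round(parent_band), -T, T).astype(int)
    parent = b[np.arange(a.shape[0]) // 2][:, np.arange(a.shape[1]) // 2]
    C = np.zeros((2 * T + 1, 2 * T + 1))
    np.add.at(C, (a + T, parent + T), 1)
    return C / C.sum()  # normalize to a 2-D joint density

def cf_moments(C, orders=(1, 2, 3)):
    """High-order characteristic-function moments of the joint density:
    |Phi| is the magnitude of the 2-D DFT of C, and the n-th moment
    weights each frequency bin by its index raised to the n-th power."""
    Phi = np.abs(np.fft.fft2(C))
    K = Phi.shape[0] // 2 + 1
    Phi = Phi[:K, :K]  # keep the non-redundant quarter of the spectrum
    k = np.arange(K)
    feats = []
    for n in orders:
        w = np.add.outer(k ** n, k ** n)
        feats.append((w * Phi).sum() / (Phi.sum() + 1e-12))
    return np.array(feats)
```

Usage would decompose the luminance channel twice (`ll1, hl1 = haar_detail(y)`, then `_, hl2 = haar_detail(ll1)`), build the inter-scale co-occurrence matrix from `hl1` and `hl2`, and feed the resulting moment vector to a classifier such as an SVM.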