Abstract. We present a new method for computing the change of light that may occur between two pictures of the same scene. We approximate the illuminant variation with the von Kries diagonal transform and estimate it by minimizing a functional that measures the divergence between the image color histograms. Our approach shows good performance in terms of accuracy of the illuminant change estimate and of robustness to pixel saturation and Gaussian noise. Moreover, we illustrate how the method can be applied to solve the problem of illuminant-invariant image recognition.
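As context for the von Kries diagonal model named above, the following is a minimal NumPy sketch of what a diagonal illuminant transform looks like and how its three gains could be recovered. Note that the recovery step here uses a pixelwise least-squares fit between registered images purely for illustration; the method of this paper instead minimizes a divergence between color histograms, which does not require pixel correspondence. All function names and the synthetic data are assumptions, not part of the original.

```python
import numpy as np

def apply_von_kries(image, k):
    """Re-illuminate an H x W x 3 image with the von Kries diagonal
    transform: each channel is scaled by its own gain k[c]."""
    return image * np.asarray(k, dtype=float)

def estimate_von_kries_pixelwise(img_a, img_b):
    """Illustrative estimator (NOT the paper's histogram method):
    per-channel least squares over registered pixels, minimizing
    ||k_c * a_c - b_c||^2, whose closed form is
    k_c = <a_c, b_c> / <a_c, a_c>."""
    a = img_a.reshape(-1, 3).astype(float)
    b = img_b.reshape(-1, 3).astype(float)
    return (a * b).sum(axis=0) / (a * a).sum(axis=0)

# Synthetic check: re-illuminate a random scene and recover the gains.
rng = np.random.default_rng(0)
scene = rng.random((8, 8, 3))
true_k = np.array([1.2, 1.0, 0.8])   # hypothetical illuminant gains
relit = apply_von_kries(scene, true_k)
est_k = estimate_von_kries_pixelwise(scene, relit)
```

In the noiseless registered case the closed-form fit recovers the gains exactly; the histogram-divergence formulation of the paper targets the harder setting where pixel correspondence is unavailable.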
Light and Color

Color descriptors are considered among the most important features in content-based image retrieval and indexing [15]. Colors are in fact robust to noise, rescaling, rotation, and changes of image resolution. The main drawback of using color for object and image retrieval is its strict dependency on the light in the scene. Color variations can be produced in different ways, for instance by changing the number, the position, or the spectrum of the light sources. Moreover, the colors of a picture often depend on the characteristics of the device used to capture the scene. The development of a device- and illuminant-invariant image representation is an old but still unsolved and attractive problem in Computer Vision [15]. In this paper, we propose a method for estimating the variation of illuminant between images of a scene taken under different lighting conditions. More precisely, we restrict our attention to the photometric changes induced by different kinds of lamps or by variations in the voltage of the lamps illuminating a scene. We assume that the illumination varies uniformly over the whole image, and we adopt the von Kries diagonal model, in which the responses of a camera sensor under two different illuminants are related by a diagonal linear transformation. This model has been shown to be a good approximation of illuminant changes [6], [7], especially in the case of narrow-band sensory systems [4], and it is employed in many color enhancement techniques, e.g. [2], [5], [3], [16]. Our technique estimates the von Kries transform between an image and a re-illuminated version of it by a least-squares method that minimizes a dissimilarity measure, named divergence, between their color histograms. The accuracy of the estimate obtained by our method has been measured on synthetic and real-world datasets,