In this paper, we present an example-based colorization technique robust to illumination differences between grayscale target and color reference images. To achieve this goal, our method performs color transfer in an illumination-independent domain that is relatively free of shadows and highlights. It first recovers an illumination-independent intrinsic reflectance image of the target scene from multiple color references obtained by web search. The reference images from the web search may be taken from different vantage points, under different illumination conditions, and with different cameras. Grayscale versions of these reference images are then used in decomposing the grayscale target image into its intrinsic reflectance and illumination components. We transfer color from the color reflectance image to the grayscale reflectance image, and obtain the final result by relighting with the illumination component of the target image. We demonstrate via several examples that our method generates results with excellent color consistency.
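The pipeline described above (decompose the target into reflectance and illumination, transfer color in the reflectance domain, then relight) can be illustrated with a toy NumPy sketch. This is not the paper's method: it substitutes a crude Retinex-style smoothness assumption for the reference-guided intrinsic decomposition, assumes the reference image is already pixel-aligned with the target, and all function names are hypothetical.

```python
import numpy as np

def box_blur(img, k=31):
    """Separable box blur: a stand-in low-pass filter for estimating
    the smooth illumination component (Retinex-style assumption)."""
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)
    return out

def decompose(gray, k=31):
    """Toy intrinsic decomposition: illumination = low-frequency component,
    reflectance = gray / illumination. The paper instead uses grayscale
    versions of the web references to guide this decomposition."""
    illum = box_blur(gray, k)
    refl = gray / np.maximum(illum, 1e-6)
    return refl, illum

def colorize(target_gray, ref_rgb, k=31):
    """Transfer color in the reflectance domain, then relight with the
    target's own illumination component. Assumes ref_rgb is aligned
    with target_gray (the paper handles differing viewpoints/cameras)."""
    ref_gray = ref_rgb @ np.array([0.299, 0.587, 0.114])
    t_refl, t_illum = decompose(target_gray, k)
    # per-pixel chromaticity of the reference, applied to the target reflectance
    chroma = ref_rgb / np.maximum(ref_gray[..., None], 1e-6)
    colored_refl = t_refl[..., None] * chroma
    # relight: multiply colorized reflectance by the target illumination
    return np.clip(colored_refl * t_illum[..., None], 0.0, 1.0)
```

Because the chrominance is taken from the reference's reflectance-domain chromaticity while the shading comes from the target's own illumination layer, shadows and highlights in the references do not leak into the result, which is the core idea of performing the transfer in an illumination-independent domain.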
(a) Target grayscale image (b) Reference images from the Internet (c) Colorized result. Figure 1: Colorization of St. Basil's Cathedral: To colorize the target image in (a), we use the reference images retrieved from the Internet shown in (b). After recovering the intrinsic color of the target, we colorize it to obtain the result in (c), free of the influence of illumination and dynamic objects in the references.