Figure 1: Complex lens flare generated by a Canon zoom lens. Left: reference photos. Right: renderings generated with our technique at comparable settings. Even with many unknowns in the lens design and scene composition, as well as manufacturing tolerances in the real lens, the renderings closely reproduce the "personality" of the flare.

Abstract: Lens flare is caused by light passing through a photographic lens system in an unintended way. Often considered a degrading artifact, it has become a crucial component of realistic imagery and an artistic device that can even increase perceived brightness. So far, only costly offline processes have allowed for convincing simulations of the complex light interactions. In this paper, we present a novel method for interactively computing physically plausible flare renderings for photographic lenses. The underlying model covers many components that are important for realism, such as imperfections, chromatic and geometric lens aberrations, and antireflective lens coatings. Various acceleration strategies allow for a performance/quality tradeoff, making our technique applicable both in real-time applications and in high-quality production rendering. We further outline artistic extensions to our system.
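The ghosting component of lens flare comes from light reflecting an even number of times between the optical surfaces of the lens before reaching the sensor: each unordered pair of surfaces contributes one two-reflection ghost path, which is why complex zoom lenses with many elements produce such intricate flares. A minimal sketch of this counting argument, together with the normal-incidence Fresnel reflectance that antireflective coatings are designed to suppress (function names and refractive indices below are our own illustrative choices, not from the paper):

```python
# Ghost flares arise from light bouncing back and forth once between a
# pair of optical surfaces; each unordered pair yields one ghost path.
from math import comb

def ghost_count(n_surfaces):
    """Number of two-reflection ghost paths through n_surfaces."""
    return comb(n_surfaces, 2)

def fresnel_r_normal(n1, n2):
    """Fresnel reflectance at normal incidence between media with
    refractive indices n1 and n2; antireflective coatings exist to
    push this value towards zero, dimming the ghosts."""
    return ((n1 - n2) / (n1 + n2)) ** 2
```

For a zoom lens with, say, 20 air-glass interfaces this already gives `ghost_count(20) = 190` candidate ghosts, each attenuated by two (possibly coated) Fresnel reflections such as `fresnel_r_normal(1.0, 1.5) = 0.04` for uncoated glass.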
Figure 1: We introduce a new computational imaging system that allows metric radial velocity information to be captured instantaneously for each pixel (center row). For this purpose, we design the temporal illumination and modulation frequencies of a time-of-flight camera (left) to be orthogonal within its exposure time. The Doppler effect of objects in motion is then detected as a frequency shift of the illumination, which results in a mapping from object velocity to recorded pixel intensity. By capturing a few coded time-of-flight measurements and adding a conventional RGB camera to the setup, we demonstrate that color, velocity, and depth information of a scene can be recorded simultaneously. The results above show several frames of two video sequences. For each example, the left-most frame shows a static object (velocity map is constant), which is then moved towards (positive radial velocity) or away from (negative velocity) the camera.

Abstract: Over the last few years, depth cameras have become increasingly popular for a range of applications, including human-computer interaction and gaming, augmented reality, machine vision, and medical imaging. Many of the commercially available devices use the time-of-flight principle, where active illumination is temporally coded and analyzed in the camera to estimate a per-pixel depth map of the scene. In this paper, we propose a fundamentally new imaging modality for all time-of-flight (ToF) cameras: per-pixel radial velocity measurement. The proposed technique exploits the Doppler effect of objects in motion, which shifts the temporal illumination frequency before it reaches the camera. Using carefully coded illumination and modulation frequencies of the ToF camera, object velocities directly map to measured pixel intensities. We show that a slight modification of our imaging system allows color, depth, and velocity information to be captured simultaneously. Combining the optical flow computed on the RGB frames with the measured metric radial velocity allows us to further estimate the full 3D metric velocity field of the scene. The proposed technique has applications in many computer graphics and vision problems, for example, motion tracking, segmentation, recognition, and motion deblurring.
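The velocity-to-intensity mapping described above can be illustrated with a toy homodyne-correlation simulation: the sensor modulation frequency is detuned from the illumination frequency so that the two signals are orthogonal over the exposure, a static scene therefore integrates to (nearly) zero, and a Doppler shift of the returning illumination breaks that orthogonality with a sign that encodes the direction of motion. All frequencies, the exposure time, and the function name below are illustrative assumptions for this sketch, not the paper's actual parameters:

```python
import numpy as np

C = 3e8  # speed of light (m/s)

def tof_pixel_intensity(v, f_illum=30e6, f_mod=30e6 + 500.0,
                        exposure=2e-3, n=2_000_000):
    """Toy model: correlate the Doppler-shifted illumination against the
    sensor modulation, integrated over one exposure. The 500 Hz detuning
    completes exactly one beat period in the 2 ms exposure, making the
    two signals orthogonal for a static scene."""
    t = np.linspace(0.0, exposure, n, endpoint=False)
    # Round-trip Doppler shift for radial velocity v (v > 0: towards camera).
    f_rx = f_illum * (1.0 + 2.0 * v / C)
    illum = np.cos(2.0 * np.pi * f_rx * t)
    mod = np.cos(2.0 * np.pi * f_mod * t)
    return float(np.mean(illum * mod))  # recorded pixel intensity (a.u.)
```

In this sketch a static target reads near zero, while a target approaching at 30 m/s produces a small response of one sign and a receding target a response of the opposite sign (which sign maps to which direction is an artifact of the chosen detuning). A real system additionally normalizes by further coded measurements to recover metric velocity from this raw correlation.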