Time-of-flight range imaging systems illuminate a scene with an amplitude-modulated light source; the light reflected from objects in the scene returns to the sensor, and the phase of the modulation envelope is measured to determine each object's distance. Because the image sensor performs this measurement for every pixel simultaneously, the entire scene can be acquired at rapid (video) update rates, making the technology ideal for real-time applications. In this paper we present an efficient real-time FPGA algorithm for determining phase, and hence distance, from the raw image sensor output. The algorithm has been implemented on a range imaging system based on the PMD19k-2 image sensor, with range processing performed in real time by a Stratix III FPGA. The scarcest resource in this implementation is RAM, and an analysis is presented to maximise the efficiency of this resource whilst maintaining acceptable processing accuracy. The algorithm can be extended to process multiple simultaneous modulation frequencies. An efficient method for combining these results to determine unambiguous range, based on the Chinese remainder theorem, is also presented.

Keywords: 3D-Imaging · FPGA · Time-of-flight · Range imaging · Chinese remainder theorem
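As a point of reference for the phase-to-distance conversion discussed throughout, the sketch below applies the standard ToF relation d = c·θ/(4π·f_mod), where θ is the measured phase shift of the modulation envelope and f_mod is the modulation frequency. This relation, the 30 MHz example frequency, and the function names are illustrative assumptions used for clarity; this is not the paper's FPGA implementation.

```c
#include <stdio.h>

#define SPEED_OF_LIGHT 299792458.0 /* m/s */

/* Convert a measured phase shift of the modulation envelope (radians) into
 * distance using the standard ToF relation d = c * theta / (4 * pi * f_mod).
 * Illustrative sketch only; the paper performs this per pixel on an FPGA. */
static double phase_to_distance(double phase_rad, double f_mod_hz)
{
    const double pi = 3.14159265358979323846;
    return SPEED_OF_LIGHT * phase_rad / (4.0 * pi * f_mod_hz);
}

int main(void)
{
    double f_mod = 30.0e6; /* assumed example: 30 MHz modulation frequency */
    double phase = 1.0;    /* assumed example: measured phase shift in radians */

    /* Unambiguous range for a single modulation frequency: d_max = c / (2 * f_mod). */
    double d_max = SPEED_OF_LIGHT / (2.0 * f_mod);

    printf("distance          = %.3f m\n", phase_to_distance(phase, f_mod));
    printf("unambiguous range = %.3f m\n", d_max);
    return 0;
}
```

At 30 MHz the unambiguous range is roughly 5 m; distances beyond this wrap around, which is what motivates the multi-frequency extension and the Chinese remainder theorem combination mentioned above.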
Background

A standard image provides only a 2-D representation of a scene, and in many applications this provides insufficient data. Range imaging combines a conventional 2-D image with the distance of each point from the camera, giving a 3-D image. An increasingly popular technique for capturing 3-D images is to utilize an active illumination source and image sensor to measure the time-of-flight (ToF) of light from the camera system to the scene and back, simultaneously for every pixel of the sensor [1][2][3][4][5]. Extracting range data from the raw pixel intensity information requires a number of image frames to be stored and temporally processed. For real-time applications such as mobile robotics [6,7] or gaming [8] it is desirable for the imaging system itself to process these frames in hardware, thereby reducing the load on the higher-level processor.

The basic principle of ToF range imaging is that a light signal is transmitted from the camera, and the time it takes to be reflected off the object and return to the camera is proportional to the distance travelled, d, by

t = 2d/c,

where t is the time taken and c is the speed of light.

Since it is difficult to directly measure the time of flight for a large number of pixels simultaneously, one approach is to amplitude modulate the light source with a sinusoid and measure the phase shift of the modulation envelope, θ. The distance for each pixel can then be calculated by