An important field of research in computer vision is the 3D analysis and reconstruction of objects and scenes. A rather new technology in this context is the Photonic Mixer Device (PMD), based on the time-of-flight principle, which measures full-range distance information in real time. Unfortunately, PMD-based devices still have limited resolution and provide only IR intensity information. This paper describes a fast algorithmic approach to combining high-resolution RGB images with PMD distance data acquired using a binocular camera setup. The resulting combined RGBZ data not only enhances the visual result but also provides a basis for advanced data processing, e.g. object recognition with sub-pixel accuracy. A simple but efficient method is used to detect the geometric occlusion caused by the binocular setup, which would otherwise lead to false color assignments. Additionally, we introduce an enhanced filtering technique used for the edge-enhanced distance refinement of the geometry provided by the PMD camera. The technique incorporates proper handling of boundaries and an iterative refinement approach, which can be used to improve the 2D/3D fusion accuracy.
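The core of such a binocular 2D/3D fusion step can be sketched as follows: each low-resolution depth pixel is back-projected to a 3D point, reprojected into the high-resolution RGB camera, and a z-buffer keeps only the nearest hit per RGB pixel, so points occluded in the RGB view receive no color assignment. This is a minimal illustrative sketch, not the paper's actual algorithm; all function and parameter names (`fuse_rgbz`, `K_pmd`, `K_rgb`, etc.) are assumptions.

```python
import numpy as np

def fuse_rgbz(depth, K_pmd, K_rgb, R, t, rgb_shape):
    """Project PMD depth pixels into the RGB camera; a z-buffer keeps the
    nearest point per RGB pixel, so occluded points are never colored."""
    h, w = depth.shape
    # Back-project every PMD pixel to a 3D point in the PMD camera frame.
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    rays = np.linalg.inv(K_pmd) @ np.stack(
        [us.ravel(), vs.ravel(), np.ones(h * w)])
    pts = rays * depth.ravel()                    # 3 x N points
    # Transform into the RGB camera frame and project.
    pts_rgb = R @ pts + t[:, None]
    u = (pts_rgb[0] / pts_rgb[2] * K_rgb[0, 0] + K_rgb[0, 2]).round().astype(int)
    v = (pts_rgb[1] / pts_rgb[2] * K_rgb[1, 1] + K_rgb[1, 2]).round().astype(int)
    z = pts_rgb[2]
    zbuf = np.full(rgb_shape, np.inf)
    src = -np.ones(rgb_shape, dtype=int)          # index of PMD pixel, or -1
    inside = (u >= 0) & (u < rgb_shape[1]) & (v >= 0) & (v < rgb_shape[0]) & (z > 0)
    for i in np.flatnonzero(inside):              # nearest point wins
        if z[i] < zbuf[v[i], u[i]]:
            zbuf[v[i], u[i]] = z[i]
            src[v[i], u[i]] = i
    return zbuf, src
```

The returned `src` map tells, for each RGB pixel, which PMD sample (if any) it may take its depth from; pixels left at `-1` are either unobserved or occluded.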
Simulation of time-of-flight (ToF) sensors has mainly been used to evaluate depth data processing algorithms, and existing approaches, therefore, focus on the generation of realistic depth data. Thus, current approaches are of limited usefulness for studying alternatives in sensor chip design, since this application area has different requirements. We propose a new physically based simulation model with a focus on realistic and practical sensor parameterization. The model is suitable for implementation on massively parallel processors such as graphics processing units, to allow fast simulation of many sensor frames across a wide range of parameter sets for meaningful evaluation. We use our implementation to evaluate two alternative approaches in continuous-wave ToF sensor design.
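The continuous-wave measurement principle underlying such a sensor model can be sketched briefly: the pixel correlates the reflected signal with four phase-shifted references and recovers distance from the resulting phase. The sketch below is an idealized textbook version (no noise, saturation, or charge-level modeling), not the authors' simulation model; all names are illustrative.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def simulate_cw_tof(true_dist, f_mod=20e6, amplitude=1.0, offset=0.5):
    """Simulate the four phase-stepped correlation samples of an ideal
    continuous-wave ToF pixel and recover the distance from their phase."""
    phi = 4 * np.pi * f_mod * true_dist / C        # round-trip phase shift
    taus = np.array([0, np.pi / 2, np.pi, 3 * np.pi / 2])
    # Ideal correlation samples A_i = offset + amplitude * cos(phi - tau_i)
    A = offset + amplitude * np.cos(phi - taus)
    phase = np.arctan2(A[1] - A[3], A[0] - A[2]) % (2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)
```

At `f_mod = 20 MHz` the unambiguous range is `C / (2 * f_mod)`, roughly 7.5 m; a physically based simulator would add photon shot noise, pixel response, and saturation on top of these ideal samples.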
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference and motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras which, among other error sources, includes single-bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison of ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, commercially available materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials, and quantitatively compare the resulting range sensor data.
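The distance error caused by single-bounce multipath interference can be illustrated with a simple phasor model: the direct return and each indirect return superpose as complex phasors, and the AMCW pixel reports the phase of the sum. This is a minimal sketch of the effect, not the paper's image-space simulation; the function name and parameters are assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def measured_distance(paths, f_mod=20e6):
    """Superpose returns given as (amplitude, path_length) pairs as complex
    phasors and return the distance an ideal AMCW pixel would report."""
    phasor = sum(a * np.exp(1j * 4 * np.pi * f_mod * d / C)
                 for a, d in paths)
    phase = np.angle(phasor) % (2 * np.pi)
    return C * phase / (4 * np.pi * f_mod)
```

With only a direct path the true distance is recovered; adding a weaker, longer indirect path biases the measurement toward the indirect path length, which is exactly the multipath artifact a simulator must reproduce.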
Synthetic Aperture Radar (SAR) data presents specific problems for interactive visualization. The high amount of multiplicative speckle noise has to be reduced, and the high dynamic range of the amplitude data must be mapped to the lower dynamic range of display devices in a way that keeps image features appropriately visible. In addition to interactive navigation in the data, it is desirable to allow interactive selection of despeckling and dynamic range reduction methods and adjustment of their parameters. Graphics processing units (GPUs) can be seen as ubiquitous parallel coprocessors with extreme computational power. In this paper, we propose a GPU-based framework for interactive visualization of SAR data. Data management techniques are used to make full use of the GPU. We reworked well-known despeckling and dynamic range reduction techniques for the GPU programming model and implemented them in our framework. Both navigation in large data sets and adjustment of processing parameters are fully interactive.
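The two processing stages named above can be sketched on the CPU side with one well-known representative each: the classic Lee filter for despeckling (a local linear MMSE estimate under a multiplicative speckle model) and logarithmic (dB) scaling for dynamic range reduction. These are illustrative stand-ins, not necessarily the exact methods or parameters of the framework.

```python
import numpy as np

def box_mean(img, win):
    """Local mean over a win x win window (edge-padded box filter)."""
    pad = win // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(win):
        for dx in range(win):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (win * win)

def despeckle_lee(amp, win=3, looks=1):
    """Lee filter: weight local detail by how much the local variance
    exceeds what pure multiplicative speckle would explain."""
    mean = box_mean(amp, win)
    var = np.maximum(box_mean(amp * amp, win) - mean * mean, 0.0)
    cu2 = 1.0 / looks   # squared variation coefficient of the speckle
    w = np.clip(1 - cu2 * mean * mean / np.maximum(var, 1e-12), 0.0, 1.0)
    return mean + w * (amp - mean)

def log_compress(amp, out_max=255.0):
    """Map high-dynamic-range amplitude to the display range via dB scaling."""
    db = 20.0 * np.log10(np.maximum(amp, 1e-12))
    db -= db.min()
    return out_max * db / max(db.max(), 1e-12)
```

Both stages are independent per-pixel-neighborhood operations, which is what makes them natural candidates for the GPU programming model the paper targets.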