The specific requirements of UAV photogrammetry necessitate particular solutions for system development, which have mostly been ignored or not assessed adequately in recent studies. Accordingly, this paper presents the methodological and experimental aspects of correctly implementing a UAV-photogrammetry system. The hardware of the system consists of an electric-powered helicopter, a high-resolution digital camera and an inertial navigation system. The software of the system includes in-house programs specifically designed for camera calibration, platform calibration, system integration, on-board data acquisition, flight planning and on-the-job self-calibration. The detailed features of the system are discussed, and solutions are proposed to enhance the system and its photogrammetric outputs. The developed system is extensively tested for precise modeling of the challenging environment of an open-pit gravel mine. The accuracy of the results is evaluated under various mapping conditions, including direct georeferencing and indirect georeferencing with different numbers, distributions and types of ground control points. Additionally, the effects of imaging configuration and network stability on modeling accuracy are assessed. The experiments demonstrated that absolute modeling accuracies of 1.55 m horizontally and 3.16 m vertically could be achieved via direct georeferencing, which improved to 0.4 cm and 1.7 cm, respectively, with indirect georeferencing.
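The horizontal and vertical accuracies quoted above are checkpoint root-mean-square errors. The computation can be sketched as follows; the residual values are hypothetical and serve only to illustrate the metric:

```python
import numpy as np

# Hypothetical checkpoint residuals (metres): columns are dE, dN, dH,
# i.e. differences between modelled and surveyed coordinates.
residuals = np.array([
    [ 0.003, -0.002,  0.012],
    [-0.004,  0.005, -0.020],
    [ 0.002, -0.003,  0.015],
    [-0.005,  0.004, -0.018],
])

# Horizontal RMSE combines the east and north components;
# vertical RMSE uses the height component alone.
rmse_h = np.sqrt(np.mean(residuals[:, 0]**2 + residuals[:, 1]**2))
rmse_v = np.sqrt(np.mean(residuals[:, 2]**2))

print(f"horizontal RMSE: {rmse_h*100:.2f} cm, vertical RMSE: {rmse_v*100:.2f} cm")
```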
The Ladybug5 is an integrated, multi-camera system that features a near-spherical field of view. It is commonly deployed on mobile mapping systems to collect imagery for 3D reality capture. This paper describes an approach for the geometric modelling and self-calibration of this system. The collinearity equations of the pinhole camera model are augmented with five radial lens distortion terms to correct the severe barrel distortion. Weighted relative orientation stability constraints are added to the self-calibrating bundle adjustment solution to enforce the angular and positional stability between the Ladybug5's six cameras. Results are presented from two calibration datasets and an independent dataset for accuracy assessment. It is demonstrated that centimetre-level 3D reconstruction accuracy can be achieved with the proposed approach. Moreover, the effectiveness of the lens distortion modelling is demonstrated: image-space precision and object-space accuracy are improved by 92% and 93%, respectively, relative to a two-term model. The high correlations between lens distortion coefficients were not found to be detrimental to the solution. The mechanical stability of the system was assessed by comparing calibrations taken before and after ten months of routine camera system use. The results suggest sub-pixel interior orientation stability and millimetre-level relative orientation stability. Analyses of accuracy and parameter correlation demonstrate that a slightly relaxed weighting strategy is preferred to tightly enforced relative orientation stability constraints.
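A five-term radial correction of the kind described above can be sketched in Python. The odd-order polynomial form shown here is the conventional one; the coefficient values are illustrative assumptions, not the calibrated Ladybug5 parameters:

```python
import numpy as np

def radial_correction(xy, k):
    """Apply an odd-order radial distortion correction to normalised
    image coordinates.  k holds coefficients k1..k5, so the correction
    polynomial runs to the r^10 term:
        dr = k1*r^2 + k2*r^4 + k3*r^6 + k4*r^8 + k5*r^10
    Barrel distortion corresponds to a negative leading coefficient."""
    x, y = xy[:, 0], xy[:, 1]
    r2 = x**2 + y**2
    dr = sum(ki * r2**(i + 1) for i, ki in enumerate(k))
    return np.column_stack([x * (1 + dr), y * (1 + dr)])

# Illustrative coefficients; a real calibration estimates these within
# the self-calibrating bundle adjustment.
k = [-0.35, 0.12, -0.03, 0.004, -0.0002]
pts = np.array([[0.1, 0.0], [0.3, 0.4]])
corrected = radial_correction(pts, k)
```

With severe barrel distortion, the higher-order terms matter at the edges of the frame, which is why a two-term model underperforms.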
Time-of-flight cameras based on Photonic Mixer Device (PMD) technology are capable of measuring distances to objects at high frame rates; however, the measured ranges and the intensity data contain systematic errors that need to be corrected. In this paper, a new integrated range camera self-calibration method via joint setup with a digital (RGB) camera is presented. This method can simultaneously estimate the systematic range error parameters as well as the interior and exterior orientation parameters of the camera. The calibration approach is based on photogrammetric bundle adjustment of observation equations originating from the collinearity condition and a range error model. The addition of a digital camera to the calibration process overcomes the limitations of the range camera's small field of view and low pixel resolution. The tests are performed on a dataset captured by a PMD[vision]-O3 camera from a multi-resolution test field of high-contrast targets. An average improvement of 83% in the RMS of range error and 72% in the RMS of coordinate residuals, over that achieved with basic calibration, was realized in an independent accuracy assessment. Our proposed calibration method also achieved 25% and 36% improvements in the RMS of range error and coordinate residuals, respectively, over that obtained by integrated calibration of the single PMD camera.
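A generic range-error correction of the kind such a self-calibration estimates might look like the following sketch. The model form here (constant offset, scale term, and one pair of periodic "wiggling" terms) is a commonly used generic model, and all parameter values are assumptions, not the paper's fitted model:

```python
import numpy as np

def correct_range(rho, d0, d1, a, b, lam=15.0):
    """Correct a raw range observation rho (metres) with a simple
    systematic error model: constant offset d0, scale term d1*rho,
    and periodic terms with modulation wavelength lam (metres)."""
    e = (d0 + d1 * rho
         + a * np.sin(4 * np.pi * rho / lam)
         + b * np.cos(4 * np.pi * rho / lam))
    return rho - e

# Example with hypothetical calibrated parameters.
rho_corrected = correct_range(5.0, d0=0.02, d1=0.001, a=0.01, b=0.005)
```

In the joint setup, these range error parameters would be estimated alongside the interior and exterior orientation parameters inside one bundle adjustment.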
ABSTRACT: Along with the advancement of unmanned aerial vehicles (UAVs), the improvement of high-resolution cameras and the development of vision-based mapping techniques, unmanned aerial imagery has attracted remarkable interest among researchers and industries. These images have the potential to provide data with unprecedented spatial and temporal resolution for three-dimensional (3D) modelling. In this paper, we present our theoretical and technical experiments regarding the development, implementation and evaluation of a UAV-based photogrammetric system for precise 3D modelling. This system was preliminarily evaluated for the application of gravel-pit surveying. The hardware of the system includes an electric-powered helicopter, a 16-megapixel visible-band camera and an inertial navigation system. The software of the system consists of in-house programs built for sensor calibration, platform calibration, system integration and flight planning. It also includes the algorithms developed for structure-from-motion (SfM) computation, including sparse matching, motion estimation, bundle adjustment and dense matching.
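The bundle-adjustment step of the SfM pipeline above minimises reprojection error under the collinearity condition, which can be sketched with synthetic values (camera intrinsics, pose and 3D points below are all made up for illustration):

```python
import numpy as np

def project(K, R, t, X):
    """Project 3D points X (N x 3) into an image with intrinsics K and
    pose (R, t) -- the collinearity condition that bundle adjustment
    enforces for every observed tie point."""
    Xc = (R @ X.T + t.reshape(3, 1)).T   # world -> camera frame
    uv = (K @ Xc.T).T                    # camera frame -> homogeneous pixels
    return uv[:, :2] / uv[:, 2:3]        # perspective division

# Synthetic intrinsics (focal length in pixels, principal point) and pose.
K = np.array([[1500.0,    0.0, 960.0],
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
X = np.array([[0.0,  0.0, 10.0],
              [1.0, -0.5, 12.0]])

uv = project(K, R, t, X)
# Bundle adjustment minimises the sum of squared differences between
# these predicted pixel coordinates and the matched observations.
```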