UAV photogrammetry today already enjoys a largely automated and efficient data processing pipeline. Moreover, the goal of dispensing with Ground Control Points (GCPs) is getting closer, as dual-frequency GNSS receivers are being put on board. This paper reports on the accuracy in object space obtained by GNSS-supported orientation of four photogrammetric blocks, acquired by a senseFly eBee RTK and all flown according to the same flight plan at 80 m above ground over a test field. Differential corrections were sent to the eBee from a nearby ground station. Block orientation was performed with three software packages: PhotoScan, Pix4D and MicMac. The influence of the a-priori precision assigned to the projection centers on checkpoint errors was studied: in most cases, the values in Z are critical. Without GCPs, the RTK solution consistently achieves an RMSE of about 2-3 cm on the horizontal coordinates of checkpoints. In elevation, the RMSE varies from flight to flight, from 2 to 10 cm. Using at least one GCP, with all packages and all test flights, the geocoding accuracy of GNSS-supported orientation is almost as good as that of a traditional GCP-based orientation in XY and only slightly worse in Z.
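The accuracy figures above are RMSE values over checkpoint residuals, with horizontal and vertical components assessed separately. A minimal sketch of that computation, using hypothetical residuals (the values below are illustrative, not the paper's data):

```python
import math

def rmse(errors):
    """Root-mean-square error of a list of residuals."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical checkpoint residuals in metres: (dX, dY, dZ) per checkpoint.
residuals = [
    (0.021, -0.018, 0.045),
    (-0.025, 0.012, -0.038),
    (0.019, 0.027, 0.052),
]

# Horizontal RMSE combines the X and Y residuals of each point;
# the vertical RMSE uses the Z residuals alone.
rmse_xy = rmse([math.hypot(dx, dy) for dx, dy, _ in residuals])
rmse_z = rmse([dz for _, _, dz in residuals])
```

Reporting XY and Z separately, as the paper does, matters because GNSS-supported orientation typically constrains elevation less tightly than the horizontal coordinates.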
High-resolution Digital Surface Models (DSMs) from unmanned aerial vehicle (UAV) imagery, with accuracy better than 10 cm, open new possibilities in geosciences and engineering. The accuracy of such DSMs depends on the number and distribution of ground control points (GCPs). Placing and measuring GCPs are often the most time-consuming on-site tasks in a UAV project. Safety or accessibility concerns may impede their proper placement, so either costlier techniques must be used, or a less accurate DSM is obtained. Photogrammetric blocks flown by drones with on-board receivers capable of RTK (real-time kinematic) positioning do not need GCPs, as camera stations at exposure time can be determined with cm-level accuracy and used to georeference the block and control its deformations. This paper presents an experimental investigation on the repeatability of DSM generation from several blocks acquired with an RTK-enabled drone, where differential corrections were sent from a local master station or a network of Continuously Operating Reference Stations (CORS). Four flights for each RTK mode were executed over a test field, according to the same flight plan. DSM generation was performed with three block control configurations: GCPs only, camera stations only, and camera stations plus one GCP. The results show that, irrespective of the RTK mode, the first and third configurations provide the best DSM inner consistency. The average range of the elevation discrepancies among the DSMs in such cases is about 6 cm (2.5 GSD, ground sample distance) for a 10-cm resolution DSM. Using camera stations only, the average range is almost twice as large (4.7 GSD). The average DSM accuracy, verified on checkpoints, turned out to be about 2.1 GSD with the first and third configurations, and 3.7 GSD with camera stations only.
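The abstract expresses accuracy in multiples of the ground sample distance (GSD), the ground footprint of one image pixel. A minimal sketch of the conversion, with illustrative camera parameters (pixel pitch and focal length below are assumptions, not the paper's sensor specifications):

```python
def gsd_metres(pixel_size_um, focal_length_mm, flight_height_m):
    """Ground sample distance: the ground footprint of one pixel.
    GSD = pixel size * flight height / focal length (consistent units)."""
    return (pixel_size_um * 1e-6) * flight_height_m / (focal_length_mm * 1e-3)

def error_in_gsd(error_m, gsd_m):
    """Express a metric error as a multiple of the GSD."""
    return error_m / gsd_m

# Hypothetical camera: 2.4 um pixels, 8 mm focal length, 80 m flight height.
gsd = gsd_metres(2.4, 8.0, 80.0)       # 0.024 m per pixel
ratio = error_in_gsd(0.06, gsd)        # a 6 cm discrepancy in GSD units
```

Quoting errors in GSD units makes results comparable across flights at different heights or with different cameras, since the GSD scales linearly with both.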
LIDAR (LIght Detection And Ranging) data are a primary data source for digital terrain model (DTM) generation and 3D city models. This paper presents a three-stage framework for robust automatic classification of raw LIDAR data into buildings, ground and vegetation, followed by reconstruction of 3D models of the buildings. In the first stage, the raw data are filtered and interpolated over a grid. In the second stage, a double segmentation of the raw data is performed, and then geometric and topological relationships among the regions resulting from segmentation are computed and stored in a knowledge base. In the third stage, a rule-based scheme is applied for the classification of the regions. Finally, polyhedral building models are reconstructed by analysing the topology of building outlines, roof slopes and eaves lines. Results obtained on data sets with different ground point densities, gathered over the town of Pavia (Italy) with Toposys and Optech airborne laser scanning systems, are shown to illustrate the effectiveness of the proposed approach.
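The third stage applies rules to region-level features to assign a class label. A toy sketch of such a rule-based classifier, using hypothetical features and thresholds (the actual rules and attributes in the paper's knowledge base are richer and include topological relationships):

```python
def classify_region(height_above_ground_m, roughness, area_m2):
    """Toy rule-based labelling of a segmented LIDAR region.
    Features and thresholds are illustrative assumptions only."""
    if height_above_ground_m < 0.5:
        return "ground"            # low regions are terrain
    if roughness > 0.5:
        return "vegetation"        # elevated but irregular surfaces
    if area_m2 > 20.0:
        return "building"          # elevated, planar and large
    return "vegetation"            # small elevated planar patches
```

Encoding the decision logic as explicit rules, rather than a trained classifier, keeps each classification decision interpretable and easy to tune per data set.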
The so-called Real Time Kinematic (RTK) option, which allows one to determine the Unmanned Aerial Vehicle (UAV) camera position at shooting time with cm-level accuracy, is also being made available on medium- and low-cost drones. It can be foreseen that a sizeable share of UAV surveys will soon be performed (almost) without Ground Control Points (GCPs). However, obstructions of the Global Navigation Satellite Systems (GNSS) signal at the optimal flight altitude, e.g., in narrow gorges, might prevent accurate retrieval of the camera station positions. In such cases, the master block can be georeferenced by tying it to an auxiliary block flown at higher altitude, where the GNSS signal is not impeded. To prove the point in a worst-case scenario, but under controlled conditions, an experiment was devised. A single strip about 700 m long, surveyed by a multi-copter at 30 m relative flight height, was referenced with cm-level accuracy by joint adjustment with a block flown at 100 m relative flight height, acquired by a fixed-wing UAV equipped with the RTK option. The joint block orientation was repeated with and without GCPs and with pre-calibrated or self-calibrated camera parameters. Accuracy on the ground was assessed on a fair number of Check Points (CPs). The results show that, even without GCPs, the precision is effectively transferred from the auxiliary block projection centres to the object point horizontal coordinates and, with a pre-calibrated camera, also to the elevations.