Recently, unmanned aerial vehicles (UAVs) have become a prominent technology in remote sensing studies, offering high-resolution, low-cost three-dimensional (3D) data that can be acquired rapidly and periodically. UAVs enable data capture at different flight altitudes, imaging geometries, and viewing angles, which makes detailed monitoring and modelling of target objects possible. Compared with earlier platforms, UAVs have been improved by integrating real-time kinematic (RTK) positioning and multispectral (MS) imaging equipment. In this study, the positioning accuracy and land cover classification potential of an RTK-equipped MS UAV were evaluated through point-based geolocation accuracy analysis and pixel-based ensemble learning algorithms. In the positioning accuracy evaluation, ground control points (GCPs), pre-defined by terrestrial global navigation satellite system (GNSS) measurements, were used as the reference data, while the Random Forest (RF) and Extreme Gradient Boosting (XGBoost) algorithms were applied for land cover classification. In addition, the spectral signatures of some major land cover classes, derived from the UAV MS bands, were compared with reference terrestrial spectroradiometer measurements. The results demonstrated that the positioning accuracy of the MS RTK UAV was ±1.1 cm in X, ±2.7 cm in Y, and ±5.7 cm in Z as root mean square error (RMSE). In RF and XGBoost pixel-based land cover classification, 13 independent land cover classes were detected with overall accuracies of 93.14% and 93.37% and kappa statistics of 0.92 and 0.93, respectively.
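The point-based geolocation accuracy analysis above amounts to computing a per-axis RMSE between GNSS-surveyed reference coordinates and their UAV-derived counterparts. A minimal sketch of that computation follows; the coordinate values and residuals are invented for illustration and are not the study's measurements:

```python
import numpy as np

# Hypothetical check points (metres): terrestrial GNSS reference coordinates
gnss_ref = np.array([
    [405210.012, 4523118.504, 112.351],
    [405233.847, 4523140.221, 113.008],
    [405251.390, 4523165.876, 111.744],
])

# Hypothetical UAV-derived coordinates for the same points (reference + residuals)
uav_est = gnss_ref + np.array([
    [0.011, -0.028, 0.055],
    [-0.009, 0.025, -0.060],
    [0.012, -0.027, 0.058],
])

# Per-axis root mean square error over all check points
rmse = np.sqrt(np.mean((uav_est - gnss_ref) ** 2, axis=0))
print(f"RMSE X: {rmse[0]*100:.1f} cm, Y: {rmse[1]*100:.1f} cm, Z: {rmse[2]*100:.1f} cm")
```

With residuals at the centimetre level in X and Y and somewhat larger in Z, the per-axis RMSE pattern mirrors the typical behaviour of RTK positioning, where the height component is the least accurate.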
In recent decades, developments in game engine technology have increased interest in virtual reality (VR) and augmented reality (AR), which offer users an interactive synthetic environment. Moreover, with the travel restrictions of the COVID-19 pandemic, VR tour applications that visualize geospatial data have gained more popularity than ever. In this study, a three-dimensional (3D) VR tour application was created for the Gebze Technical University (GTU) Campus by integrating unmanned aerial vehicle (UAV) data into an artificial environment using the cross-platform game engine Unity. To create high-quality 3D models of the campus, different imaging geometries and flight altitudes were applied. The aerial photos were acquired at a ground sampling distance (GSD) of ≤2.2 cm with a 20-megapixel (MP) Sony Exmor RGB camera. Point cloud processing and the generation of high-quality 3D products were carried out in the structure-from-motion (SfM) based photogrammetric software Agisoft Metashape. Using 86 well-distributed ground control points (GCPs), a geometric correction accuracy of ±2 cm (~0.9 pixels) was reached as root mean square error (RMSE). The generated 3D models were imported into the Unity environment, and the negative influence of high-polygon data on application performance was reduced by applying occlusion culling and space subdivision rendering optimization algorithms. The visual quality of the VR scene was improved by adding 3D models of individual objects such as trees, benches and arbors. To enhance the information content of the VR tour, interactive information panels containing building metadata such as building name, block name and total floor area were placed in the scene. Finally, a first-person player was implemented for a realistic VR experience.
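The reported GSD follows from the standard photogrammetric relation GSD = H · (s_w / n_w) / f, i.e. flight altitude times pixel pitch divided by focal length. The sketch below evaluates it with assumed parameters for a 20 MP 1-inch Sony Exmor sensor (13.2 mm sensor width, 8.8 mm focal length, 5472 px image width); these camera constants are illustrative assumptions, not values stated in the abstract:

```python
def ground_sampling_distance(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """GSD in metres per pixel: altitude times pixel pitch, divided by focal length."""
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return altitude_m * pixel_pitch_mm / focal_length_mm

# Assumed parameters for a 20 MP 1-inch Sony Exmor sensor, 80 m flight altitude
gsd_80m = ground_sampling_distance(80.0, 13.2, 8.8, 5472)
print(f"GSD at 80 m: {gsd_80m * 100:.2f} cm/px")
```

Under these assumptions a flight at roughly 80 m altitude yields a GSD of about 2.2 cm, consistent with the resolution reported above.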
Abstract. Recently, improvements in game engines have increased interest in virtual reality (VR) technologies, which engage users with an artificial environment, and have led to the adoption of VR systems for displaying geospatial data. Because of the ongoing COVID-19 pandemic, and the resulting necessity to stay at home, VR tours have become very popular. In this paper, we created a three-dimensional (3D) virtual tour of the Gebze Technical University (GTU) Southern Campus by transferring high-resolution unmanned aerial vehicle (UAV) data into a virtual domain. UAV data are preferred in various applications because of their high spatial resolution, low cost and fast processing time. In this application, the study area was captured in UAV flights with different imaging modes and altitudes, with a minimum ground sampling distance (GSD) of 2.18 cm, using a 20 MP digital camera. The UAV data were processed in the structure-from-motion (SfM) based photogrammetric software Agisoft Metashape, and high-quality 3D textured mesh models were generated. Image orientation was completed using an optimal number of ground control points (GCPs), and the geometric accuracy was calculated as ±8 mm (~0.4 pixels). To create the VR tour, the UAV-based mesh models were transferred into the Unity game engine, and optimization was carried out by applying occlusion culling and space subdivision algorithms. To improve the visualization, 3D object models such as trees, lighting poles and arbours were positioned in the VR scene. Finally, textual metadata about the buildings and a player with a first-person camera were added for an informative VR experience.
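The space subdivision optimization mentioned above partitions the scene so that only objects in cells near the camera are submitted for rendering. Unity performs this internally, but the idea can be illustrated with a minimal grid-based culling sketch; the object names, positions and cell size below are invented for illustration:

```python
from collections import defaultdict

CELL = 50.0  # grid cell size in metres (assumed)

def cell_of(x, z):
    """Map a ground-plane position to its integer grid cell."""
    return (int(x // CELL), int(z // CELL))

# Hypothetical static scene objects: name -> (x, z) position on the ground plane
objects = {"building_A": (12.0, 30.0), "tree_17": (14.0, 42.0),
           "arbour_3": (160.0, 155.0), "lamp_9": (410.0, 80.0)}

# Pre-pass: bucket every object into its grid cell once
grid = defaultdict(list)
for name, (x, z) in objects.items():
    grid[cell_of(x, z)].append(name)

def visible(camera_xz, radius_cells=1):
    """Return only objects in cells adjacent to the camera; all other cells are culled."""
    cx, cz = cell_of(*camera_xz)
    out = []
    for dx in range(-radius_cells, radius_cells + 1):
        for dz in range(-radius_cells, radius_cells + 1):
            out.extend(grid.get((cx + dx, cz + dz), []))
    return out

print(visible((20.0, 35.0)))
```

Because culling decisions are made per cell rather than per object, the cost of deciding what to draw no longer grows with the total number of objects in the scene, which is what makes high-polygon UAV meshes tractable at interactive frame rates.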