2019
DOI: 10.3390/rs11151811
New Strategies for Time Delay Estimation during System Calibration for UAV-Based GNSS/INS-Assisted Imaging Systems

Abstract: The need for accurate 3D spatial information is growing rapidly in many of today’s key industries, such as precision agriculture, emergency management, infrastructure monitoring, and defense. Unmanned aerial vehicles (UAVs) equipped with global navigation satellite systems/inertial navigation systems (GNSS/INS) and consumer-grade digital imaging sensors are capable of providing accurate 3D spatial information at a relatively low cost. However, with the use of consumer-grade sensors, system calibration is criti…

Cited by 17 publications (14 citation statements). References 35 publications.
“…Both the RGB camera and the VLP-16 sensor are mounted on a DJI Matrice 600 Pro (M600P) platform. Spatial and temporal system calibration for the datasets used in this study were conducted using the approaches described in [45] and [46], respectively. Additionally, the georeferenced orthomosaics were generated using the structure from motion strategies introduced in [47,48].…”
Section: Remote Sensing Data (citation type: mentioning, confidence: 99%)
“…In this strategy, the mounting parameters are simultaneously estimated through minimizing the discrepancies among linear/planar features and conjugate points extracted from LiDAR point clouds and images from different flight lines. Also, similar to the approach proposed in Reference [31], a time offset calibration is done to solve and correct for any possible time delay between the actual camera exposure and recorded event marker by the GNSS/INS unit. This approach modifies the collinearity equations so that the time delay can be directly estimated in the bundle adjustment process.…”
Section: Data Acquisition System (citation type: mentioning, confidence: 99%)
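The excerpt above describes estimating a time delay between the actual camera exposure and the event marker recorded by the GNSS/INS unit. The core idea can be sketched as follows; this is an illustrative sketch, not the cited authors' implementation, and the function names and trajectory values are assumptions. The camera position at the true exposure time, t_event + Δt, is interpolated from the GNSS/INS trajectory, so Δt can be carried as an unknown in the bundle adjustment.

```python
import numpy as np

def platform_position(traj_t, traj_xyz, t_event, dt):
    """Interpolate the GNSS/INS position at the corrected exposure time.

    traj_t   : (N,) trajectory timestamps [s]
    traj_xyz : (N, 3) trajectory positions [m]
    t_event  : event-marker time recorded by the GNSS/INS unit [s]
    dt       : time delay between actual exposure and event marker [s]
    """
    t = t_event + dt  # actual exposure time
    return np.array([np.interp(t, traj_t, traj_xyz[:, i]) for i in range(3)])

# Toy trajectory: constant velocity of 10 m/s along x at 50 m altitude.
traj_t = np.linspace(0.0, 10.0, 101)
traj_xyz = np.column_stack([10.0 * traj_t, np.zeros(101), np.full(101, 50.0)])

p0 = platform_position(traj_t, traj_xyz, t_event=5.0, dt=0.0)
p1 = platform_position(traj_t, traj_xyz, t_event=5.0, dt=0.02)
# A 20 ms delay at 10 m/s shifts the estimated camera position 0.2 m along-track,
# which is why even small synchronization errors matter at UAV flying speeds.
```

In the actual adjustment, Δt would be a shared unknown across all images, and its partial derivative with respect to the image residuals involves the platform velocity, which is why varying velocities help decorrelate it from other parameters.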
“…Spatial system calibration parameters include internal characteristics of the onboard camera(s), known as interior orientation parameters (IOPs), as well as mounting parameters which describe the differences in the position and orientation between the GNSS/INS body frame and camera(s) frame [28][29][30]. On the other hand, temporal system calibration aims at solving and correcting for any possible time delay in the synchronization between the GNSS/INS unit and the camera(s) onboard the UAV system [31,32].…”
Section: Introduction (citation type: mentioning, confidence: 99%)
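The mounting parameters mentioned in the excerpt relate the GNSS/INS body frame to the camera frame through a lever arm (position offset) and a boresight (rotation offset). A minimal sketch of applying them to derive the camera pose from a georeferenced body pose, with illustrative values (not from the paper):

```python
import numpy as np

def camera_pose(r_body, R_body, lever_arm, R_boresight):
    """Derive the camera position/orientation from the GNSS/INS body pose.

    r_body      : (3,) body-frame origin in the mapping frame [m]
    R_body      : (3, 3) rotation from body frame to mapping frame
    lever_arm   : (3,) camera offset expressed in the body frame [m]
    R_boresight : (3, 3) rotation from camera frame to body frame
    """
    r_cam = r_body + R_body @ lever_arm   # lever-arm correction
    R_cam = R_body @ R_boresight          # boresight composition
    return r_cam, R_cam

# Illustrative values: body frame aligned with mapping frame,
# camera mounted 0.1 m below the GNSS/INS unit.
r_body = np.array([100.0, 200.0, 50.0])
R_body = np.eye(3)
lever_arm = np.array([0.0, 0.0, -0.1])
R_boresight = np.eye(3)

r_cam, R_cam = camera_pose(r_body, R_body, lever_arm, R_boresight)
```

Spatial system calibration estimates `lever_arm` and `R_boresight` (together with the interior orientation parameters), since small errors in either propagate directly into the georeferenced object coordinates.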
“…Establishing such ground targets is expensive and labor intensive, and more importantly, the distribution and number of GCPs are usually less than optimal to provide adequate control for determining system calibration parameters. To overcome this limitation, there has also been recent research focusing on UAV-based RGB camera [9] and hyperspectral scanner system calibration [10] without the need for GCPs. Yet, even without using GCPs, those techniques have only been applied using manually-measured tie points in overlapping images.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
“…For a robust estimation of system calibration parameters with minimum correlation between such parameters, specific flight missions are required. As discussed in [9], to conduct spatial and temporal calibration, it is recommended to derive the system parameters using opposite flying directions at different flying heights, as well as having a variation in the linear and angular velocities. In addition to potential differences among overlapping UAV images such as changes in illumination, viewpoint, and distortions, data acquisition from multiple flying heights results in imagery with varying scale.…”
Section: Introduction (citation type: mentioning, confidence: 99%)
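The flight-mission recommendation in the excerpt above (opposite flying directions at different flying heights) can be sketched as a simple plan generator; the geometry and parameter values here are illustrative assumptions, not the configuration used in the paper:

```python
def calibration_lines(length, heights, spacing=20.0, n_lines=2):
    """Generate (start_xyz, end_xyz) flight lines for a calibration mission.

    Lines alternate direction along x (opposite flying directions) and are
    repeated at each flying height to introduce scale variation.
    """
    lines = []
    for h in heights:
        for i in range(n_lines):
            y = i * spacing
            if i % 2 == 0:
                lines.append(((0.0, y, h), (length, y, h)))
            else:
                lines.append(((length, y, h), (0.0, y, h)))
    return lines

# Two opposite-direction lines at each of two flying heights.
plan = calibration_lines(length=200.0, heights=[25.0, 40.0])
```

Opposite directions help decorrelate the boresight angles from the time delay, and multiple heights decorrelate the lever arm from the interior orientation parameters, at the cost of the scale differences in imagery that the excerpt notes.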