Abstract: Unmanned aerial vehicles (UAVs) represent a quickly evolving technology, broadening the availability of remote sensing tools to small-scale research groups across a variety of scientific fields. Development of UAV platforms requires broad technical skills covering platform development, data post-processing, and image analysis. UAV development is constrained by the need to balance technological accessibility, flexibility in application, and quality in image data. In this study, the quality of UAV imagery acquired …
“…Lens distortion is a radially dependent geometric shift or deviation from the rectilinear projection [46]. As SfM is very sensitive to distortion, camera calibration is a crucial pre-processing step [4,47].…”
Section: Suggested Improvements (a) Camera Pre-calibration
confidence: 99%
“…This uses multiple images of a calibration grid, which is usually displayed on a flat panel screen and then imaged with the camera [46]. As this does not work with a thermal camera, the grid was printed on an A3 sheet of paper, which was fixed on a rigid wooden frame.…”
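The grid-based pre-calibration described in this snippet can be sketched with OpenCV's standard checkerboard routine. The pattern size, helper names, and overall setup below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

PATTERN = (9, 6)  # inner corners of the printed grid (assumed size)

def board_points(pattern=PATTERN):
    # 3D reference corners of the printed grid, lying in its own z = 0 plane;
    # unit grid spacing is sufficient for estimating intrinsics
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)
    return objp

def calibrate(gray_images, pattern=PATTERN):
    # OpenCV imported lazily here so board_points() works without it installed
    import cv2
    obj_pts, img_pts = [], []
    for gray in gray_images:
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(board_points(pattern))
            img_pts.append(corners)
    h, w = gray_images[0].shape
    # returns RMS reprojection error, camera matrix K, and the distortion
    # coefficients (k1, k2, p1, p2, k3) of the Brown model
    return cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
```

The estimated coefficients can then be fixed in the SfM software instead of being optimized during alignment.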
Section: Suggested Improvements a Camera Pre-calibrationmentioning
Abstract:The current standard procedure for aligning thermal imagery with structure-from-motion (SfM) software uses GPS logger data for the initial image location. As input data, all thermal images of the flight are rescaled to cover the same dynamic scale range, but they are not corrected for changes in meteorological conditions during the flight. This standard procedure can give poor results, particularly in datasets with very low contrast between and within images or when mapping very complex 3D structures. To overcome this, three alignment procedures were introduced and tested: camera pre-calibration, correction of thermal imagery for small changes in air temperature, and improved estimation of the initial image position by making use of the alignment of RGB (visual) images. These improvements were tested and evaluated in an agricultural (low temperature contrast data) and an afforestation (complex 3D structure) dataset. In both datasets, the standard alignment procedure failed to align the images properly, either by resulting in point clouds with several gaps (images that were not aligned) or with unrealistic 3D structure. Using initial thermal camera positions derived from RGB image alignment significantly improved thermal image alignment in all datasets. Air temperature correction had a small yet positive impact on image alignment in the low-contrast agricultural dataset, but a minor effect in the afforestation area. The effect of camera calibration on the alignment was limited in both datasets. Still, in both datasets, the combination of all three procedures significantly improved the alignment, in terms of number of aligned images and of alignment quality.
“…The image preprocessing workflows followed Kelcey and Lucieer [41]. In short, the following three corrections were applied: (i) noise reduction using dark current imagery; (ii) lens vignetting correction based on spatially dependent correction factors; and (iii) removal of lens distortion with a modified Brown-Conrady model.…”
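The three corrections listed in this snippet can be sketched as follows. Function names and coefficients are illustrative, not the values from Kelcey and Lucieer [41], and the undistortion keeps only the radial terms of the Brown-Conrady model for brevity:

```python
import numpy as np

def correct_dark_current(img, dark_frame):
    # (i) noise reduction: subtract a dark-current frame
    # (e.g. recorded with the lens capped; assumed procedure)
    return img - dark_frame

def correct_vignetting(img, gain):
    # (ii) multiply by spatially dependent per-pixel correction factors
    return img * gain

def undistort_brown(img, k1, k2):
    # (iii) remove radial lens distortion with the radial terms of a
    # Brown-Conrady model (nearest-neighbour resampling for brevity)
    h, w = img.shape
    cx, cy = w / 2.0, h / 2.0
    ys, xs = np.indices((h, w), dtype=float)
    x, y = (xs - cx) / cx, (ys - cy) / cy
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    # sample the distorted image at the radially displaced coordinates
    src_x = np.clip(np.round(x * factor * cx + cx), 0, w - 1).astype(int)
    src_y = np.clip(np.round(y * factor * cy + cy), 0, h - 1).astype(int)
    return img[src_y, src_x]
```

A production pipeline would use calibrated per-pixel gains and the full Brown-Conrady model with tangential terms, as in the cited workflow.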
Abstract: Unmanned aerial system (UAS)-based remote sensing is one promising technique for precision crop management, but few studies have reported the application of such systems to nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR) and multispectral (MS) cameras) onboard UAS for the estimation of N status at individual stages and their combination, using field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting), and field samples were collected on nearby dates. Two color indices (normalized excess green index (NExG) and normalized green red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)) and two red edge vegetation indices (red edge chlorophyll index (CIred edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R²: 0.79-0.81, root mean squared error (RMSE): 1.43-1.45 g m⁻²) and PNA (R²: 0.81-0.84, RMSE: 2.27-2.38 g m⁻²). The GNDVI from CIR images yielded a moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may set strong foundations for the development of UAS-based rice growth monitoring systems, providing useful information for the real-time decision making on crop N management.
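The index definitions behind the abbreviations in this abstract are standard in the literature and reduce to simple per-pixel band arithmetic. Band reflectances are assumed in [0, 1]; NExG and ENDVI have several published variants, so only the unambiguous indices are sketched here:

```python
def ngrdi(r, g):
    # normalized green red difference index, from RGB bands
    return (g - r) / (g + r)

def gndvi(nir, g):
    # green normalized difference vegetation index, from CIR imagery
    return (nir - g) / (nir + g)

def ci_red_edge(nir, re):
    # red edge chlorophyll index: NIR over red edge, minus one
    return nir / re - 1.0

def datt(nir, re, r):
    # DATT index (Datt 1999): red edge normalized by the NIR-red difference
    return (nir - re) / (nir - r)
```

In practice these are applied to whole orthomosaic bands (e.g. NumPy arrays) rather than scalars, with the same expressions.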
“…In the last five years, image acquisition with UAVs has increased exponentially (Colomina and Molina 2014; Zhang and Kovacs 2012). While multi- and hyperspectral sensors are still quite expensive and their image acquisition and analysis require a high level of expertise (Aasen et al. 2015; Kelcey and Lucieer 2012), it is nowadays possible even for laypersons to take georeferenced RGB images with a very high spatial resolution using low-cost UAVs, e.g. DJI's Phantom 3 Pro (www.dji.com).…”
ABSTRACT: The development of UAV-based sensing systems for agronomic applications serves to improve crop management. The latter is the focus of precision agriculture, which aims to optimize yield, fertilizer input, and crop protection. Moreover, in some cropping systems vehicle-based sensing devices are less suitable because fields cannot be entered from certain growth stages onwards. This is true for rice, maize, sorghum, and many other crops. Consequently, UAV-based sensing approaches fill a niche of very high resolution data acquisition at the field scale in space and time. While mounting RGB digital compact cameras on low-weight UAVs (< 5 kg) is well established, the miniaturization of sensors in recent years also enables hyperspectral data acquisition from these platforms. From both RGB and hyperspectral data, vegetation indices (VIs) are computed to estimate crop growth parameters. In this contribution, we compare two different sensing approaches from a low-weight UAV platform (< 5 kg) for monitoring a nitrogen field experiment of winter wheat and a corresponding farmers' field in Western Germany. (i) A standard digital compact camera was flown to acquire RGB images, which are used to compute the RGBVI, and (ii) the NDVI is computed from a newly modified version of the Yara N-Sensor. The latter is a well-established tractor-based hyperspectral sensor for crop management that has been commercially available for a decade. It was modified for this study to fit the requirements of UAV-based data acquisition. Consequently, we focus on three objectives in this contribution: (1) evaluate the potential of the uncalibrated RGBVI for monitoring nitrogen status in winter wheat, (2) investigate the UAV-based performance of the modified Yara N-Sensor, and (3) compare the results of the two UAV-based sensing approaches for winter wheat.
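The two indices compared in this abstract are commonly defined as below; the RGBVI formula follows its usual form in the literature (Bendig et al. 2015), which may differ in detail from the exact computation used here:

```python
def rgbvi(r, g, b):
    # RGBVI from uncalibrated RGB digital numbers:
    # (G^2 - B*R) / (G^2 + B*R), as commonly defined
    return (g * g - b * r) / (g * g + b * r)

def ndvi(nir, r):
    # classical NDVI from red and near-infrared reflectance,
    # here derived from the modified Yara N-Sensor's bands
    return (nir - r) / (nir + r)
```

Both range over [-1, 1], which is what makes them comparable across the calibrated and uncalibrated sensing approaches.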