Lodging is recognized as a major destructive factor for crop quality and yield, creating a need for cost-efficient and accurate methods to detect crop lodging routinely. Using structure-from-motion (SfM) and novel geospatial computing algorithms, this study investigated the potential of high-resolution imaging with unmanned aircraft system (UAS) technology for detecting and assessing lodging severity over an experimental maize field at the Texas A&M AgriLife Research and Extension Center in Corpus Christi, Texas, during the 2016 growing season. The proposed method not only detects the occurrence of lodging at the field scale but also quantitatively estimates the number of lodged plants and the lodging rate within individual rows. Nadir-view images of the field trial were taken routinely by multiple UAS platforms equipped with consumer-grade red, green, and blue (RGB) and near-infrared (NIR) cameras, enabling timely observation of plant growth until harvest. Models of canopy structure were reconstructed via an SfM photogrammetric workflow. UAS-estimated maize height was characterized by polygons developed and expanded from individual row centerlines and showed reliable accuracy when compared against field measurements of height from multiple dates. The method then segmented individual maize rows into multiple grid cells and determined lodging severity by comparing height percentiles within each grid cell against preset thresholds. The UAS-based lodging results were generally comparable in accuracy to measurements made by a human data collector on the ground, for both the number of lodged plants (R² = 0.48) and the lodging rate (R² = 0.50) on a per-row basis. The results also showed a negative relationship between ground-measured yield and both UAS-estimated and ground-measured lodging rates.
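The per-row grid-cell logic described above can be sketched as follows. This is a minimal illustration, not the study's implementation: the cell size, percentile, and height threshold are all hypothetical placeholders, and the function name is invented for this example.

```python
import numpy as np

def lodging_rate_per_row(row_heights, cell_size=20, percentile=90,
                         height_threshold=1.0):
    """Estimate the lodging rate within one maize row.

    row_heights      : 1-D array of canopy-height samples along the row (m)
    cell_size        : number of height samples per grid cell (assumed)
    percentile       : height percentile evaluated within each cell (assumed)
    height_threshold : height (m) below which a cell is flagged lodged (assumed)
    """
    row_heights = np.asarray(row_heights, dtype=float)
    n_cells = len(row_heights) // cell_size
    if n_cells == 0:
        return 0.0
    lodged = 0
    for i in range(n_cells):
        cell = row_heights[i * cell_size:(i + 1) * cell_size]
        # A cell whose upper-percentile height falls below the threshold
        # is counted as containing lodged plants.
        if np.percentile(cell, percentile) < height_threshold:
            lodged += 1
    return lodged / n_cells
```

Working from height percentiles rather than cell means makes the flag robust to a few stray low points (e.g., gaps between plants) within an otherwise standing cell.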
Core Ideas
- We comprehensively validated the use of UAS in sorghum and maize breeding programs.
- Temporal estimates of plant growth will allow researchers to elucidate new phenotypes.
- The stage of the breeding pipeline dictates the applicability of UAS platforms.
- The implementation of UAS is demonstrated in different crop species.
- Monetary and time costs should be considered before implementation of UAS.

To meet future world food and fiber demands, plant breeders must increase the rate of genetic improvement of important agricultural crops. One of the biggest obstacles now facing crop scientists is a phenotyping bottleneck. To ease this burden, the emerging technology of unmanned aerial systems (UAS) presents an exciting opportunity. To assess the utility of UAS, it is important to investigate their application across multiple crop species. Terminal plant height is of great importance to maize (Zea mays L.) and sorghum [Sorghum bicolor (L.) Moench] breeders and has been hypothesized to be useful but has been logistically impractical to measure in the field. In this study, we statistically analyzed in depth the ability of UAS to estimate height in sorghum (advanced and early generation material) and maize (optimal and late material) and the application of these estimates in breeding programs. We found that UAS explain genotypic variation similarly to ground-truth methods and that the repeatability of the methodology is high (R = 0.61–0.99), indicating effective differentiation of genotypes. Additionally, correlations between ground truth and UAS measurements were moderate to high for all materials (r = 0.4–0.9). Finally, we present a novel application for the technology in the form of high-resolution temporal growth curves. Using these UAS-generated growth curves, new physiological insights can be obtained and new avenues of scientific investigation are possible.
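A temporal growth curve of the kind described above can be built by fitting a sigmoid to repeated UAS height estimates. The sketch below assumes a three-parameter logistic model and uses entirely hypothetical flight dates and heights; the paper does not specify this particular functional form.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    # K: asymptotic (terminal) height, r: growth rate, t0: inflection day
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical UAS height estimates (m) across flight dates (days after planting)
days = np.array([20, 30, 40, 50, 60, 70, 80], dtype=float)
height = np.array([0.15, 0.40, 0.90, 1.60, 2.10, 2.35, 2.45])

# Fit the growth curve; p0 is a rough initial guess for (K, r, t0)
params, _ = curve_fit(logistic, days, height, p0=[2.5, 0.1, 45.0])
K, r, t0 = params
```

The fitted parameters themselves become candidate phenotypes: terminal height (K), maximum growth rate (proportional to K·r/4 at the inflection), and timing of rapid growth (t0) can each be compared across genotypes.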
"Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment," J. Appl. Remote Sens. 11(2), 026035 (2017), doi: 10.1117/1.JRS.11.026035. Abstract. A small, fixed-wing unmanned aircraft system (UAS) was used to survey a replicated small plot field experiment designed to estimate sorghum damage caused by an invasive aphid. Plant stress varied among 40 plots through manipulation of aphid densities. Equipped with a consumer-grade near-infrared camera, the UAS was flown on a recurring basis over the growing season. The raw imagery was processed using structure-from-motion to generate normalized difference vegetation index (NDVI) maps of the fields and three-dimensional point clouds. NDVI and plant height metrics were averaged on a per plot basis and evaluated for their ability to identify aphid-induced plant stress. Experimental soil signal filtering was performed on both metrics, and a method filtering low near-infrared values before NDVI calculation was found to be the most effective. UAS NDVI was compared with NDVI from sensors onboard a manned aircraft and a tractor. The correlation results showed dependence on the growth stage. Plot averages of NDVI and canopy height values were compared with per-plot yield at 14% moisture and aphid density. The UAS measures of plant height and NDVI were correlated to plot averages of yield and insect density. Negative correlations between aphid density and NDVI were seen near the end of the season in the most damaged crops.
Deep learning has already proven to be a powerful state-of-the-art technique for many image understanding tasks in computer vision and other applications, including remote sensing (RS) image analysis. Unmanned aircraft systems (UASs) offer a viable and economical alternative to conventional sensors and platforms for acquiring data of high spatial and temporal resolution with high operational flexibility. Coastal wetlands are among the most challenging and complex ecosystems for land cover prediction and mapping because land cover targets often show high intra-class and low inter-class variance. In recent years, several deep convolutional neural network (CNN) architectures have been proposed for pixel-wise image labeling, commonly called semantic image segmentation. In this paper, some of the more recent deep CNN architectures proposed for semantic image segmentation are reviewed, and each model's training efficiency and classification performance are evaluated by training it on a limited labeled image set. Training samples are drawn from hyper-spatial resolution UAS imagery over a wetland area, and the required ground truth images are prepared by manual image labeling. Experimental results demonstrate that deep CNNs have great potential for accurate land cover prediction using UAS hyper-spatial resolution images. Some simple deep learning architectures perform comparably to, or even better than, complex and very deep architectures with remarkably fewer training epochs. This performance is especially valuable when limited training samples are available, a common situation in most RS applications.
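Pixel-wise labeling as described above means the network outputs a class score for every pixel of the input tile. The toy encoder-decoder below illustrates the idea only; the paper evaluates published architectures, not this network, and all layer sizes here are arbitrary assumptions.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """A deliberately small encoder-decoder for semantic segmentation
    (illustrative sketch, not an architecture from the paper)."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # halve spatial resolution
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),  # upsample back
            nn.Conv2d(16, n_classes, 1),            # per-pixel class scores
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinySegNet(n_classes=4)
x = torch.randn(1, 3, 64, 64)    # one RGB image tile
logits = model(x)                # (1, 4, 64, 64): a score per class per pixel
pred = logits.argmax(dim=1)      # (1, 64, 64): the predicted label map
```

The output label map has the same spatial dimensions as the input, which is what distinguishes semantic segmentation from whole-image classification.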
This paper examines the potential use of fire extinguishing balls as part of a proposed system in which drone and remote-sensing technologies are utilized cooperatively as a supplement to traditional firefighting methods. The proposed system consists of (1) a scouting unmanned aircraft system (UAS) to detect spot fires and monitor, via remote sensing, the risk of wildfire approaching a building, fence, and/or firefighting crew; (2) a communication UAS to establish and extend the communication channel between the scouting UAS and the firefighting UAS; and (3) a firefighting UAS that autonomously travels to waypoints to drop fire extinguishing balls (environmentally friendly, heat-activated suppressants). This concept is under development through a transdisciplinary multi-institutional project. The scope of this paper encompasses a general illustration of the design and the experiments conducted so far to evaluate fire extinguishing balls. The results show that the smaller fire extinguishing balls available in the global marketplace, when attached to drones, might not be effective against building fires (unless the buildings already have open windows). In contrast, even the smaller balls might be effective in extinguishing short grass fires (a ball of around 0.5 kg extinguished a 1-meter circle of short grass). This finding steered the authors toward wildfire fighting rather than building fires. The paper also demonstrates the construction of heavy-payload drones (around 15 kg payload) and reports progress on an apparatus, attachable to drones, for carrying fire extinguishing balls.