Background: Aboveground biomass (AGB) is a widely used agronomic parameter for characterizing crop growth status and predicting grain yield. The rapid, accurate and non-destructive estimation of AGB is useful for making informed decisions in precision crop management. Previous studies have investigated vegetation indices (VIs) and canopy height metrics derived from unmanned aerial vehicle (UAV) data to estimate the AGB of various crops. However, the input variables were derived either from one type of data or from different sensors on board UAVs. Whether the combination of VIs and canopy height metrics derived from a single low-cost UAV system can improve AGB estimation accuracy remains unclear. This study used a low-cost UAV system to acquire imagery at a 30 m flight altitude at critical growth stages of wheat in Rugao, eastern China. The experiments were conducted in 2016 and 2017 and involved 36 field plots representing variations in cultivar, nitrogen fertilization level and sowing density. We evaluated the performance of VIs, canopy height metrics and their combination for AGB estimation in wheat with stepwise multiple linear regression (SMLR) and three machine learning algorithms (support vector regression, SVR; extreme learning machine, ELM; random forest, RF). Results: The combination of VIs and canopy height metrics improved the estimation accuracy for wheat AGB over the use of VIs or canopy height metrics alone. Specifically, RF performed best among SMLR and the three machine learning algorithms, regardless of whether all original variables or the variables selected by SMLR were used. The best accuracy (R2 = 0.78, RMSE = 1.34 t/ha, rRMSE = 28.98%) was obtained when applying RF to the combination of VIs and canopy height metrics.
Conclusions: Our findings imply that an inexpensive approach combining the RF algorithm with RGB imagery and point cloud data derived from a consumer-grade, low-cost UAV system can improve the accuracy of AGB estimation, and has potential for practical application in the rapid estimation of other growth parameters.
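As a minimal sketch of the idea of combining the two predictor families, the following regresses AGB on VIs plus a canopy-height metric via ordinary least squares, in the spirit of the SMLR baseline. This is not the study's code: the feature values, coefficients and noise level are all synthetic and purely illustrative.

```python
import numpy as np

# Illustrative sketch only (not the study's code): regress AGB on the
# combination of vegetation indices and canopy-height metrics, as in the
# SMLR baseline. All feature values and coefficients are synthetic.
rng = np.random.default_rng(42)
n = 36  # e.g. one observation per field plot

vi = rng.uniform(0.2, 0.9, size=(n, 2))       # two hypothetical VIs
height = rng.uniform(0.2, 1.2, size=(n, 1))   # hypothetical mean canopy height (m)
X = np.hstack([np.ones((n, 1)), vi, height])  # intercept + combined predictors
agb = 1.0 + 4.0 * vi[:, 0] + 6.0 * height[:, 0] + rng.normal(0, 0.1, n)  # t/ha

beta, *_ = np.linalg.lstsq(X, agb, rcond=None)  # ordinary least squares fit
pred = X @ beta
r2 = 1.0 - np.sum((agb - pred) ** 2) / np.sum((agb - agb.mean()) ** 2)
```

Swapping the OLS fit for an RF, SVR or ELM regressor on the same combined matrix X reproduces the comparison structure described in the abstract.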
Leaf area index (LAI) is a significant biophysical variable in models of hydrology, climatology and crop growth. Rapid monitoring of LAI is critical in modern precision agriculture. Remote sensing (RS) from satellites, aircraft and unmanned aerial vehicles (UAVs) has become a popular technique for monitoring crop LAI. Among these platforms, UAVs are highly attractive to researchers and agriculturists. However, some UAV vegetation index (VI)-derived LAI models have relatively low accuracy because of the limited number of multispectral bands; in particular, they tend to saturate at middle to high LAI levels, which are the LAI levels of high-yielding wheat crops in China. This study aims to effectively estimate wheat LAI from UAV narrowband multispectral imagery (400–800 nm spectral region, 10 cm resolution) under varying growth conditions during five critical growth stages, and to provide potential technical support for optimizing nitrogen fertilization. Results demonstrated that the newly developed LAI model based on the modified triangular vegetation index (MTVI2) has better accuracy, with a higher coefficient of determination (Rc2 = 0.79, Rv2 = 0.80), a lower relative root mean squared error (RRMSE = 24%), and higher sensitivity across LAI values from 2 to 7, which broadens the applicable range of the new LAI model. Furthermore, this LAI model displayed stable performance across sub-categories of growth stage, variety and eco-site. In conclusion, this study could provide effective technical support for precisely monitoring crop growth with UAVs across various crop yield levels, which should prove helpful for family farms in modern agriculture.
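For reference, MTVI2 has a standard closed form (Haboudane et al., 2004) computed from NIR (~800 nm), green (~550 nm) and red (~670 nm) reflectances. The band assignments and example reflectance values below are illustrative, not taken from the study.

```python
import math

# MTVI2 (modified triangular vegetation index 2), standard definition.
# Inputs are reflectances at roughly 800 nm (NIR), 550 nm (green) and
# 670 nm (red); the example values below are illustrative only.
def mtvi2(nir: float, green: float, red: float) -> float:
    num = 1.5 * (1.2 * (nir - green) - 2.5 * (red - green))
    den = math.sqrt((2 * nir + 1) ** 2 - (6 * nir - 5 * math.sqrt(red)) - 0.5)
    return num / den

# Example with plausible canopy reflectances.
value = mtvi2(nir=0.50, green=0.10, red=0.08)
```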
Leaf area index (LAI) is a fundamental indicator of plant growth status in agronomic and environmental studies. Due to rapid advances in unmanned aerial vehicle (UAV) and sensor technologies, UAV-based remote sensing is emerging as a promising solution for monitoring crop LAI with great flexibility and applicability. This study aimed to determine the feasibility of combining color and texture information derived from UAV-based digital images for estimating LAI of rice (Oryza sativa L.). Rice field trials were conducted at two sites using different nitrogen application rates, varieties, and transplanting methods from 2016 to 2017. Digital images were collected using a consumer-grade UAV after sampling at the key growth stages of tillering, stem elongation, panicle initiation and booting. Vegetation color indices (CIs) and grey level co-occurrence matrix-based textures were extracted from mosaicked UAV ortho-images for each plot. To construct indices from two different textures, normalized difference texture indices (NDTIs) were calculated from pairs of randomly selected textures. The relationships between rice LAI and each calculated index were then compared using simple linear regression. Multivariate regression models with different input sets were further used to test the potential of combining CIs with various textures for rice LAI estimation. The results revealed that the visible atmospherically resistant index (VARI) based on three visible bands and the NDTI based on the mean textures derived from the red and green bands were the best for LAI retrieval in the CI and NDTI groups, respectively. Independent accuracy assessment showed that random forest (RF) exhibited the best predictive performance when combining CI and texture inputs (R2 = 0.84, RMSE = 0.87, MAE = 0.69). This study introduces a promising approach of combining color indices and textures from UAV-based digital imagery for rice LAI estimation.
Future studies are needed to identify the best operation mode, suitable ground resolution, and optimal predictive methods for practical applications.
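The two index families named above have simple standard definitions: VARI is computed from the three visible bands, and an NDTI is the normalized difference of any two texture values. A minimal sketch, with illustrative input values:

```python
# Standard definitions of the two index families used above:
# VARI from visible-band reflectances, and NDTI from any two
# texture values t1, t2 (e.g. GLCM mean textures of two bands).
def vari(green: float, red: float, blue: float) -> float:
    # visible atmospherically resistant index
    return (green - red) / (green + red - blue)

def ndti(t1: float, t2: float) -> float:
    # normalized difference texture index
    return (t1 - t2) / (t1 + t2)

v = vari(green=0.30, red=0.20, blue=0.10)  # 0.1 / 0.4 = 0.25
t = ndti(0.6, 0.4)                         # 0.2 / 1.0 = 0.2
```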
Unmanned aerial system (UAS)-based remote sensing is one promising technique for precision crop management, but few studies have reported the application of such systems to nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR) and multispectral (MS) cameras) onboard a UAS for the estimation of N status at individual growth stages and their combination, using field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting), and field samplings were taken at a near date. Two color indices (normalized excess green index (NExG) and normalized green red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)) and two red edge vegetation indices (red edge chlorophyll index (CIred edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R2: 0.79–0.81, root mean squared error (RMSE): 1.43–1.45 g m−2) and PNA (R2: 0.81–0.84, RMSE: 2.27–2.38 g m−2). The GNDVI from CIR images yielded a moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may lay a strong foundation for the development of UAS-based rice growth monitoring systems, providing useful information for real-time decision-making on crop N management.
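Three of the indices evaluated above have standard closed forms in the literature; the sketch below writes them out from band reflectances (the input values in any usage are illustrative, not from the study):

```python
# Common definitions of three indices named above; inputs are band
# reflectances. Definitions follow their standard forms in the literature.
def ngrdi(green: float, red: float) -> float:
    # normalized green red difference index (RGB camera)
    return (green - red) / (green + red)

def gndvi(nir: float, green: float) -> float:
    # green normalized difference vegetation index (CIR/MS camera)
    return (nir - green) / (nir + green)

def ci_red_edge(nir: float, red_edge: float) -> float:
    # red edge chlorophyll index (MS camera)
    return nir / red_edge - 1.0
```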
Leaf area index (LAI) and leaf dry matter (LDM) are important indices of crop growth. Real-time, nondestructive monitoring of crop growth is instructive for the diagnosis of crop growth and prediction of grain yield. Unmanned aerial vehicle (UAV)-based remote sensing is widely used in precision agriculture due to its unique advantages in flexibility and resolution. This study was carried out on wheat trials treated with different nitrogen levels and seeding densities in three regions of Jiangsu Province in 2018–2019. Canopy spectral images were collected by a UAV equipped with a multi-spectral camera during key wheat growth stages. To verify the results of the UAV images, the LAI, LDM, and yield data were obtained by destructive sampling. We extracted the wheat canopy reflectance and selected the best vegetation index for monitoring growth and predicting yield. Simple linear regression (LR), multiple linear regression (MLR), stepwise multiple linear regression (SMLR), partial least squares regression (PLSR), artificial neural network (ANN), and random forest (RF) modeling methods were used to construct models for wheat yield estimation. The results show that a multi-spectral camera mounted on a multi-rotor UAV has broad application prospects in crop growth monitoring and yield estimation. The vegetation indices combining the red edge band and the near-infrared band were significantly correlated with LAI and LDM. Machine learning methods (i.e., PLSR, ANN, and RF) performed better for predicting wheat yield. The RF model constructed from the normalized difference vegetation index (NDVI) at the jointing, heading, flowering, and filling stages was the optimal wheat yield estimation model in this study, with an R2 of 0.78 and a relative root mean square error (RRMSE) of 0.1030.
The results provide a theoretical basis for monitoring crop growth with a multi-rotor UAV platform and offer a technical method for improving the precision of yield estimation.
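For completeness, NDVI and the RRMSE metric used to score the yield model above both have standard definitions; the sketch below writes them out (the example observed/predicted yields are invented for illustration):

```python
import math

# Standard definitions: NDVI from NIR and red reflectances, and the
# relative RMSE (RRMSE) used above to score yield predictions
# (RMSE divided by the mean of the observed values).
def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

def rrmse(observed, predicted) -> float:
    n = len(observed)
    rmse = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return rmse / (sum(observed) / n)

v = ndvi(nir=0.5, red=0.1)          # 0.4 / 0.6 ~= 0.667
err = rrmse([6.0, 8.0], [6.3, 7.7])  # invented yields (t/ha), illustration only
```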
The accurate estimation of aboveground biomass (AGB) and leaf area index (LAI) is critical to characterize crop growth status and predict grain yield. Unmanned aerial vehicle (UAV)-based remote sensing has attracted significant interest due to its high flexibility and ease of operation. The mixed-effect model introduced in this study can capture secondary factors that cannot be captured by standard empirical relationships. The objective of this study was to explore the potential benefit of using a linear mixed-effect (LME) model and multispectral images from a fixed-wing UAV to estimate both AGB and LAI of rice. Field experiments were conducted over two consecutive years (2017–2018) that involved different N rates, planting patterns and rice cultivars. Images were collected by a compact multispectral camera mounted on a fixed-wing UAV during key rice growth stages. LME, simple regression (SR), artificial neural network (ANN) and random forest (RF) models were developed relating the growth parameters (AGB and LAI) to spectral information. Cultivar (C), growth stage (S) and planting pattern (P) were selected as candidate random effects for the LME models due to their significant effects on rice growth. Compared to the other regression models (SR, ANN and RF), the LME model improved the AGB estimation accuracy for all stage groups to varying degrees: the R2 increased by 0.14–0.35 and the RMSE decreased by 0.88–1.80 t ha−1 for the whole season; the R2 increased by 0.07–0.15 and the RMSE decreased by 0.31–0.61 t ha−1 for pre-heading stages; and the R2 increased by 0.21–0.53 and the RMSE decreased by 0.72–1.52 t ha−1 for post-heading stages. Further analysis suggested that the LME model also predicted successfully within groups when the number of groups was suitable.
More importantly, depending on the availability of C, S, P or combinations thereof, the inclusion of secondary effects as mixed effects allowed the LME model to outperform the baseline retrieval methods (SR, ANN or RF). Satisfactory results were also obtained for the LAI estimation, although the superiority of the LME model was not as pronounced as for AGB estimation. This study demonstrates that the LME model could accurately estimate rice AGB and LAI and that fixed-wing UAVs are promising for monitoring crop growth status over large-scale farmland.
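The random-intercept idea behind an LME model can be sketched with a simple two-stage approximation: fit the fixed effect (e.g. AGB against a VI) by pooled least squares, then estimate a per-group intercept from the group-mean residuals. This is a didactic simplification, not the study's implementation (which would jointly estimate both parts by maximum likelihood), and all data below are synthetic.

```python
import numpy as np

# Hedged sketch (not the study's implementation): a two-stage
# approximation of a random-intercept model, AGB ~ VI with a random
# intercept per growth-stage group. All data here are synthetic.
rng = np.random.default_rng(1)
stages = np.repeat([0, 1, 2], 20)        # three growth-stage groups
u_true = np.array([-1.0, 0.0, 1.5])      # synthetic group-level offsets
vi = rng.uniform(0.2, 0.9, stages.size)  # synthetic vegetation index
agb = 2.0 + 8.0 * vi + u_true[stages] + rng.normal(0, 0.1, stages.size)

# Stage 1: pooled ordinary least squares for the fixed effect.
X = np.column_stack([np.ones_like(vi), vi])
beta, *_ = np.linalg.lstsq(X, agb, rcond=None)
resid = agb - X @ beta

# Stage 2: random intercepts approximated by per-group residual means.
u_hat = np.array([resid[stages == g].mean() for g in range(3)])
pred = X @ beta + u_hat[stages]          # group-aware prediction
```

The between-group differences recovered in u_hat approximate those in u_true, which is the "secondary effect" a plain pooled regression cannot express.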