Leaf area index (LAI) is a fundamental indicator of plant growth status in agronomic and environmental studies. Due to rapid advances in unmanned aerial vehicle (UAV) and sensor technologies, UAV-based remote sensing is emerging as a promising solution for monitoring crop LAI with great flexibility and applicability. This study aimed to determine the feasibility of combining color and texture information derived from UAV-based digital images for estimating the LAI of rice (Oryza sativa L.). Rice field trials were conducted at two sites with different nitrogen application rates, varieties, and transplanting methods in 2016 and 2017. Digital images were collected using a consumer-grade UAV after sampling at the key growth stages of tillering, stem elongation, panicle initiation, and booting. Vegetation color indices (CIs) and grey level co-occurrence matrix (GLCM)-based textures were extracted from mosaicked UAV ortho-images for each plot. To form indices from two different textures, normalized difference texture indices (NDTIs) were calculated from pairs of the extracted textures. The relationships between rice LAI and each calculated index were then compared using simple linear regression. Multivariate regression models with different input sets were further used to test the potential of combining CIs with various textures for rice LAI estimation. The results revealed that the visible atmospherically resistant index (VARI), based on the three visible bands, and the NDTI based on the mean textures derived from the red and green bands were the best for LAI retrieval in the CI and NDTI groups, respectively. Independent accuracy assessment showed that random forest (RF) exhibited the best predictive performance when combining CI and texture inputs (R2 = 0.84, RMSE = 0.87, MAE = 0.69). This study introduces a promising solution of combining color indices and textures from UAV-based digital imagery for rice LAI estimation.
Future studies should focus on identifying the best operation mode, suitable ground resolution, and optimal predictive methods for practical applications.
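The two best-performing indices named above follow standard definitions: VARI from the three visible bands, and an NDTI from any pair of texture measures. The sketch below assumes the usual symmetric normalized-difference forms; the function names are illustrative, not code from the study.

```python
def vari(red, green, blue):
    """Visible atmospherically resistant index from the three visible bands."""
    return (green - red) / (green + red - blue)

def ndti(t1, t2):
    """Normalized difference texture index for any pair of texture measures,
    e.g. the GLCM mean textures of the red and green bands."""
    return (t1 - t2) / (t1 + t2)
```

Both functions also work element-wise on NumPy arrays of per-plot band means or texture values.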
Unmanned aerial system (UAS)-based remote sensing is one promising technique for precision crop management, but few studies have reported the application of such systems to nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR), and multispectral (MS) cameras) onboard a UAS for the estimation of N status at individual stages and their combination, using field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities, and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting), and field samples were collected on near dates. Two color indices (normalized excess green index (NExG) and normalized green red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)), and two red edge vegetation indices (red edge chlorophyll index (CIred edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R2: 0.79-0.81, root mean squared error (RMSE): 1.43-1.45 g m−2) and PNA (R2: 0.81-0.84, RMSE: 2.27-2.38 g m−2). The GNDVI from CIR images yielded moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may lay a strong foundation for the development of UAS-based rice growth monitoring systems, providing useful information for real-time decision making on crop N management.
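Several of the indices compared above have widely used closed forms; the sketch below shows one index per sensor type (NGRDI for RGB, GNDVI for CIR, CIred edge for MS). These are the standard published formulas, and whether the study used exactly these forms is an assumption.

```python
def ngrdi(red, green):
    """Normalized green red difference index (RGB camera)."""
    return (green - red) / (green + red)

def gndvi(nir, green):
    """Green normalized difference vegetation index (CIR camera)."""
    return (nir - green) / (nir + green)

def ci_red_edge(nir, red_edge):
    """Red edge chlorophyll index (multispectral camera)."""
    return nir / red_edge - 1.0
```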
Unmanned aerial vehicle (UAV)-based remote sensing (RS) possesses the significant advantage of being able to efficiently collect images for precision agricultural applications. Although numerous methods have been proposed to monitor crop nitrogen (N) status in recent decades, how to choose an appropriate modeling algorithm for estimating crop leaf N content (LNC) remains poorly understood, especially with UAV multispectral imagery. A comparative assessment of different modeling algorithms (i.e., simple and non-parametric modeling algorithms alongside the physical model retrieval method) for winter wheat LNC estimation is presented in this study. Experiments were conducted over two consecutive years and involved different winter wheat varieties, N rates, and planting densities. A five-band multispectral camera (i.e., 490 nm, 550 nm, 671 nm, 700 nm, and 800 nm) was mounted on a UAV to acquire canopy images across five critical growth stages. The results of this study showed that the best-performing vegetation index (VI) was the modified renormalized difference VI (RDVI), which had a determination coefficient (R2) of 0.73 and a root mean square error (RMSE) of 0.38. This method was also characterized by a high processing speed (0.03 s) for model calibration and validation. Among the 13 non-parametric modeling algorithms evaluated here, the random forest (RF) approach performed best, characterized by R2 and RMSE values of 0.79 and 0.33, respectively. This method also had the advantage of full optical spectrum utilization and enabled flexible, non-linear fitting with a fast processing speed (2.3 s). Compared to the other two methods assessed here, the use of a look-up table (LUT)-based radiative transfer model (RTM) remained challenging with regard to LNC estimation because of low prediction accuracy (i.e., an R2 value of 0.62 and an RMSE value of 0.46) and slow processing speed.
The RF approach is a fast and accurate technique for N estimation based on UAV multispectral imagery.
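The RF workflow described above (calibrate on the five bands, validate on held-out plots) can be sketched with scikit-learn. The reflectance data and the band-to-LNC relationship below are entirely synthetic, invented only to make the example runnable; real work would use per-plot reflectance extracted from the UAV imagery.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# synthetic canopy reflectance for the five bands (490, 550, 671, 700, 800 nm)
X = rng.uniform(0.02, 0.6, size=(200, 5))
# hypothetical LNC driven by a red-edge/NIR contrast plus small noise (invented)
y = 1.5 + 3.0 * (X[:, 4] - X[:, 3]) / (X[:, 4] + X[:, 3]) + rng.normal(0, 0.05, 200)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X[:150], y[:150])            # calibration subset
r2 = model.score(X[150:], y[150:])     # validation R2 on held-out samples
```

The same fit/score pattern applies to any of the non-parametric regressors compared in the study; only the estimator class changes.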
Plant nitrogen concentration (PNC) is a critical indicator of N status for crops, and can be used for N nutrition diagnosis and management. This work aims to explore the potential of multispectral imagery from an unmanned aerial vehicle (UAV) for PNC estimation and to improve the estimation accuracy with hyperspectral data collected in the field with a hyperspectral radiometer. In this study, we combined selected vegetation indices (VIs) and texture information to estimate PNC in rice. The VIs were calculated from ground and aerial platforms, and the texture information was obtained from UAV-based multispectral imagery. Two consecutive years (2015 and 2016) of experiments were conducted, involving different N rates, planting densities, and rice cultivars. Both UAV flights and ground spectral measurements were taken along with destructive samplings at critical growth stages of rice (Oryza sativa L.). After UAV imagery preprocessing, both VIs and texture measurements were calculated. Then the optimal normalized difference texture index (NDTI) from UAV imagery was determined for separated stage groups and for the entire season. Results demonstrated that aerial VIs performed well only for pre-heading stages (R2 = 0.52–0.70), while the photochemical reflectance index and blue N index from the ground (PRIg and BNIg) performed consistently well across all growth stages (R2 = 0.48–0.65 and 0.39–0.68). Most texture measurements were weakly related to PNC, but the optimal NDTIs could explain 61% and 51% of the variability in PNC for the separated stage groups and the entire season, respectively. Moreover, stepwise multiple linear regression (SMLR) models combining aerial VIs and NDTIs did not significantly improve the accuracy of PNC estimation, while models composed of BNIg and the optimal NDTIs exhibited significant improvement for PNC estimation across all growth stages.
Therefore, the integration of ground-based narrow band spectral indices with UAV-based textural information might be a promising technique in crop growth monitoring.
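The combination step that worked best above (BNIg plus the optimal NDTI) amounts to a two-predictor linear regression. The sketch below fits such a model by ordinary least squares; all index values and the true coefficients are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
bni = rng.uniform(0.2, 0.8, n)     # ground-based blue N index (invented values)
ndti = rng.uniform(-0.3, 0.3, n)   # optimal UAV texture index (invented values)
# hypothetical PNC generated from both predictors plus noise
pnc = 1.2 + 2.5 * bni + 1.8 * ndti + rng.normal(0, 0.05, n)

# fit PNC = b0 + b1*BNIg + b2*NDTI by least squares
X = np.column_stack([np.ones(n), bni, ndti])
b0, b1, b2 = np.linalg.lstsq(X, pnc, rcond=None)[0]
```

SMLR as used in the study additionally screens predictors for entry/removal; the fitting step for any retained subset is the same least-squares solve.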
This paper evaluates the potential of integrating textural and spectral information from unmanned aerial vehicle (UAV)-based multispectral imagery for improving the quantification of nitrogen (N) status in rice crops. Vegetation indices (VIs), normalized difference texture indices (NDTIs), and their combination were used to estimate four N nutrition parameters: leaf nitrogen concentration (LNC), leaf nitrogen accumulation (LNA), plant nitrogen concentration (PNC), and plant nitrogen accumulation (PNA). Results demonstrated that the normalized difference red-edge index (NDRE) performed best in estimating the N nutrition parameters among all the VI candidates. The optimal texture indices performed comparably to NDRE in estimating the N nutrition parameters. Significant improvement for all N nutrition parameters could be obtained by integrating VIs with NDTIs using multiple linear regression. When tested across years and growth stages, the multivariate models also exhibited satisfactory estimation accuracy. For texture analysis, texture metrics calculated in direction D3 (perpendicular to the row orientation) are recommended for monitoring row-planted crops. These findings indicate that the addition of textural information derived from UAV multispectral imagery could reduce the effects of background materials and saturation and enhance the N signals of rice canopies for the entire season.
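The directional texture step above can be sketched in plain NumPy: build a grey level co-occurrence matrix (GLCM) for one pixel offset and take its mean statistic. The offset (dy=1, dx=0) is assumed to represent direction D3 when crop rows run left to right in the image, and the input is assumed to be already quantized to a small number of grey levels; both are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def glcm(img, dy, dx, levels):
    """Grey level co-occurrence matrix for one non-negative offset (dy, dx);
    img must be an integer array quantized to values in [0, levels)."""
    h, w = img.shape
    ref = img[:h - dy, :w - dx].ravel()   # reference pixels
    nbr = img[dy:, dx:].ravel()           # neighbours at the given offset
    p = np.zeros((levels, levels))
    np.add.at(p, (ref, nbr), 1.0)         # count co-occurrences
    return p / p.sum()                    # normalize to probabilities

def glcm_mean(p):
    """GLCM 'mean' texture: expected reference grey level under the GLCM."""
    return float(p.sum(axis=1) @ np.arange(p.shape[0]))

# direction D3: offset down the image, perpendicular to horizontal crop rows
img = np.array([[0, 1], [2, 3]])
p = glcm(img, dy=1, dx=0, levels=4)
```

Computing `glcm_mean` per band and plot, then feeding the values into the `ndti`-style normalized difference, reproduces the texture-index pipeline used across these studies.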