Abstract: Leaf area index (LAI) is a significant biophysical variable in hydrological, climatological, and crop growth models. Rapid monitoring of LAI is critical in modern precision agriculture. Remote sensing (RS) from satellites, aircraft, and unmanned aerial vehicles (UAVs) has become a popular technique for monitoring crop LAI. Among these platforms, UAVs are highly attractive to researchers and agriculturists. However, some UAV vegetation index (VI)-derived LAI models have relatively low accuracy because of the limited numb…
“…In general, the implementation of UAS in agriculture has focused on the extraction of information at the "canopy scale" for further biophysical and yield prediction [21,22]. This approach has been extensively reported via the integration of UAS and sensors: RGB, multispectral, hyperspectral, and thermal imagery have been used to estimate biomass [23], LAI [23][24][25][26][27][28], canopy height [21,23,29,30], nitrogen [27,31,32], chlorophyll [32,33], and temperature [34][35][36]. Recently, Jin et al. [37] estimated plant density in wheat from UAS observations using an RGB sensor, ultra-high-resolution imagery, and a support vector machine classifier.…”
Corn (Zea mays L.) is one of the crops most sensitive to planting pattern and early-season uniformity. The most common method of determining the number of plants is visual inspection on the ground, but this field activity is time-consuming, labor-intensive, and biased, and may lead to less profitable decisions by farmers. The objective of this study was to develop a reliable, timely, and unbiased method for counting corn plants based on ultra-high-resolution imagery acquired from unmanned aerial systems (UAS), enabling automatic field scouting under real field conditions. A ground sampling distance of 2.4 mm was targeted to extract information at the plant level. First, an excess greenness (ExG) index was used to separate green pixels from the background; then row and inter-row contours were identified and extracted. A scalable training procedure was implemented using geometric descriptors as inputs to the classifier. Second, a decision tree was implemented and tested using two training modes at each site to expose the workflow to different ground conditions at the time of aerial data acquisition. Differences in performance were due to training modes and spatial resolutions at the two sites. For the object classification task, an overall accuracy of 0.96, based on the proportion of correctly classified corn and non-corn objects, was obtained for local (per-site) classification, and an accuracy of 0.93 was obtained for the combined training modes. For successful model implementation, plants should have two to three leaves when images are collected (avoiding overlap between plants). The best workflow performance was reached at 2.4 mm resolution, corresponding to an altitude of 10 m (the lowest altitude); higher altitudes were gradually penalized. The latter coincided with the larger number of green objects detected in the images and the effectiveness of geometry as a descriptor for corn plant detection.
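The first step described above, separating green pixels from the soil background with an excess greenness index, can be sketched with the standard ExG formula on chromatic coordinates, ExG = 2g − r − b. This is a minimal illustration using the common literature definition; the threshold value and any preprocessing are assumptions, not taken from the paper itself:

```python
import numpy as np

def exg_mask(rgb, threshold=0.1):
    """Segment green vegetation from background using Excess Greenness.

    ExG = 2g - r - b on chromatic coordinates, where r, g, b are each
    channel divided by the per-pixel channel sum. The 0.1 threshold is
    illustrative, not from the study.
    """
    rgb = rgb.astype(float)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # avoid division by zero on pure-black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    exg = 2 * g - r - b
    return exg > threshold

# toy 1x2 image: one green (vegetation-like) pixel, one grey (soil-like) pixel
img = np.array([[[40, 180, 30], [120, 110, 100]]], dtype=np.uint8)
mask = exg_mask(img)
```

Thresholding ExG is popular for this task because chromatic coordinates largely cancel overall brightness, so shadowed and sunlit vegetation score similarly.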
“…Three cameras were mounted separately onboard the UAV for image collection; their technical specifications are listed in Table 2. The Tetracam mini-MCA6 (Tetracam Inc., Chatsworth, CA, USA) MS camera has six channels and has been evaluated in the literature for other purposes [32,[38][39][40]. The camera has user-configurable band-pass filters (Andover Corporation, Salem, NH, USA) of 10-nm full-width at half-maximum and center wavelengths at blue (490 nm), green (550 nm), red (680 nm), red edge (720 nm), NIR1 (800 nm) and NIR2 (900 nm).…”
Section: UAV Campaigns and Sensors
“…They were applied to water stress detection [26], disease detection [27] and vigor monitoring [28,29]. The UAS images in the aforementioned studies have been used to estimate the agronomic parameters LAI [30][31][32] and biomass [24,33]. N status, as one of the most important agronomic parameters in precision farming, needs to be addressed with UAS due to the low efficiency of other RS techniques.…”
Unmanned aerial system (UAS)-based remote sensing is one promising technique for precision crop management, but few studies have reported the application of such systems to nitrogen (N) estimation with multiple sensors in rice (Oryza sativa L.). This study aims to evaluate three sensors (RGB, color-infrared (CIR) and multispectral (MS) cameras) onboard UAS for the estimation of N status at individual stages and their combination, using field data collected from a two-year rice experiment. The experiments were conducted in 2015 and 2016, involving different N rates, planting densities and rice cultivars, with three replicates. An Oktokopter UAS was used to acquire aerial photography at early growth stages (from tillering to booting), and field samplings were taken on a near date. Two color indices (normalized excess green index (NExG) and normalized green-red difference index (NGRDI)), two near-infrared vegetation indices (green normalized difference vegetation index (GNDVI) and enhanced NDVI (ENDVI)) and two red edge vegetation indices (red edge chlorophyll index (CI_red edge) and DATT) were used to evaluate the capability of these three sensors in estimating leaf nitrogen accumulation (LNA) and plant nitrogen accumulation (PNA) in rice. The results demonstrated that the red edge vegetation indices derived from MS images produced the highest estimation accuracy for LNA (R²: 0.79–0.81, root mean squared error (RMSE): 1.43–1.45 g m⁻²) and PNA (R²: 0.81–0.84, RMSE: 2.27–2.38 g m⁻²). The GNDVI from CIR images yielded moderate estimation accuracy with an all-stage model. Color indices from RGB images exhibited satisfactory performance for the pooled dataset of the tillering and jointing stages. Compared with the counterpart indices from the RGB and CIR images, the indices from the MS images performed better in most cases.
These results may set strong foundations for the development of UAS-based rice growth monitoring systems, providing useful information for the real-time decision making on crop N management.
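Three of the indices named in this abstract can be computed directly from band reflectances. The sketch below uses the common literature formulas for NGRDI, GNDVI, and the red edge chlorophyll index; these definitions and the example reflectance values are assumptions for illustration, not values reported by the study:

```python
def vegetation_indices(green, red, red_edge, nir):
    """Compute three band-ratio indices from reflectance values.

    Formulas are the common literature definitions:
      NGRDI       = (green - red) / (green + red)
      GNDVI       = (nir - green) / (nir + green)
      CI_red_edge = nir / red_edge - 1
    """
    return {
        "NGRDI": (green - red) / (green + red),
        "GNDVI": (nir - green) / (nir + green),
        "CI_red_edge": nir / red_edge - 1.0,
    }

# hypothetical reflectances for a healthy canopy: low red, high NIR
idx = vegetation_indices(green=0.10, red=0.08, red_edge=0.25, nir=0.45)
```

NGRDI needs only an RGB camera, GNDVI needs a NIR band (CIR camera), and CI_red_edge needs a red edge band (MS camera), which mirrors the sensor comparison the study performs.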
“…Although these studies obtained better LAI and stem height estimation accuracy than the current study, more time was required for model construction and computation. Better results may be achieved with lidar, hyperspectral data, or unmanned aerial vehicle data [7,8]; however, data collection depends on weather conditions, which can prevent the continuous monitoring of crops. This study was also the first to test the feasibility of using a model built over the whole crop growth cycle as an alternative to models built at each stage of crop growth.…”
Abstract: In this study, 27 polarimetric parameters were extracted from Radarsat-2 polarimetric synthetic aperture radar (SAR) at each growth stage of the rape crop. The sensitivity to growth parameters such as stem height, leaf area index (LAI), and biomass was investigated as a function of days after sowing. Based on the sensitivity analysis, five empirical regression models were compared to determine the best model for stem height, LAI, and biomass inversion. Of these five models, quadratic models had higher R² values than the other models in most cases of growth parameter inversion, but when these results were related to physical scattering mechanisms, the inversion results overestimated the performance of some parameters. By contrast, linear and logarithmic models, which had lower R² values than the quadratic models, performed stably for growth parameter inversion, particularly at each growth stage. The best biomass inversion performance was obtained by the volume component with a quadratic model, with an R² value of 0.854 and a root mean square error (RMSE) of 109.93 g m⁻². The best LAI inversion was also obtained with a quadratic model, but using the radar vegetation index (Cloude), with an R² value of 0.8706 and an RMSE of 0.56 m² m⁻². Stem height was obtained from the scattering angle alpha (α) using a logarithmic model, with an R² value of 0.926 and an RMSE of 11.09 cm. The performance of these models was also analysed for biomass estimation at the second growth stage (P2), third growth stage (P3), and fourth growth stage (P4). The results showed that the models built at the P3 stage had better substitutability with the models built over all growth stages. From the mapping results, we conclude that a model built at the P3 stage can be used for rape biomass inversion, with 90% of estimation errors being less than 100 g m⁻².
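The model comparison described above, fitting linear, logarithmic, and quadratic empirical regressions of a growth parameter against a SAR-derived predictor and ranking them by R², can be sketched as follows. The synthetic data and the use of `np.polyfit` are assumptions for illustration, not the study's actual data or fitting procedure:

```python
import numpy as np

def compare_models(x, y):
    """Fit linear, logarithmic, and quadratic least-squares models of y
    on predictor x, and return the in-sample R^2 of each."""
    def r2(pred):
        ss_res = np.sum((y - pred) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    lin  = np.polyval(np.polyfit(x, y, 1), x)                  # y = a*x + b
    quad = np.polyval(np.polyfit(x, y, 2), x)                  # y = a*x^2 + b*x + c
    logm = np.polyval(np.polyfit(np.log(x), y, 1), np.log(x))  # y = a*ln(x) + b
    return {"linear": r2(lin), "logarithmic": r2(logm), "quadratic": r2(quad)}

# hypothetical example: biomass roughly quadratic in the predictor, plus noise
x = np.linspace(0.1, 1.0, 20)
y = 3 * x ** 2 + 0.5 * x + np.random.default_rng(0).normal(0, 0.05, x.size)
scores = compare_models(x, y)
```

Because the quadratic model nests the linear one, its in-sample R² is always at least as high, which is consistent with the abstract's caution that a higher R² can reflect overfitting rather than a physically better inversion, hence the authors' check against scattering mechanisms and per-stage stability.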