A Light Detection and Ranging (LiDAR) sensor mounted on an Unmanned Aerial Vehicle (UAV) can map the overflown environment in point clouds. Mapped canopy heights allow for the estimation of crop biomass in agriculture. The work presented in this paper contributes to the design of a sensory UAV setup for mapping and analysing agricultural fields. LiDAR data are combined with data from Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) sensors to map the environment in point clouds. The proposed method is demonstrated with LiDAR recordings in an experimental winter wheat field. Crop height estimates ranging from 0.35 m to 0.58 m correlate with the applied nitrogen treatments of 0–300 kg N/ha. The LiDAR point clouds are recorded, mapped, and analysed using the functionalities of the Robot Operating System (ROS) and the Point Cloud Library (PCL). Crop volume estimation is based on a voxel grid with a spatial resolution of 0.04 × 0.04 × 0.001 m. Two different flight patterns are evaluated at an altitude of 6 m to determine their impact on the mapped LiDAR measurements and the resulting crop volume estimates.
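The voxel-grid volume estimate described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: points are binned into voxels at the stated 0.04 × 0.04 × 0.001 m resolution, and the crop volume is the number of occupied voxels times the volume of one voxel. The function name and the sample points are invented for the example.

```python
# Minimal sketch: crop volume from a point cloud via an occupancy voxel grid,
# using the 0.04 x 0.04 x 0.001 m resolution stated in the abstract.
from math import floor

VOXEL = (0.04, 0.04, 0.001)  # voxel edge lengths in metres (x, y, z)

def crop_volume(points, voxel=VOXEL):
    """Estimate volume as (number of occupied voxels) * (volume of one voxel)."""
    occupied = {
        tuple(floor(coord / size) for coord, size in zip(p, voxel))
        for p in points
    }
    return len(occupied) * voxel[0] * voxel[1] * voxel[2]

# Two points fall in the same voxel and count once; the third occupies another.
pts = [(0.01, 0.01, 0.0005), (0.02, 0.03, 0.0007), (0.10, 0.01, 0.0005)]
print(crop_volume(pts))  # volume of 2 occupied voxels
```

In practice PCL's `VoxelGrid` filter performs the binning step on real point clouds; the set-of-integer-indices trick above is just the same idea in plain Python.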
GrassClover is a diverse image and biomass dataset collected in an outdoor agricultural setting. The images contain dense populations of grass and clover mixtures with heavy occlusions and occurrences of weeds. Fertilization and treatment of mixed crops depend on the local species composition. The overall challenge is therefore to predict the species composition in the canopy image and in the biomass. The dataset was collected with three different acquisition systems with ground sampling distances of 4–8 px/mm. The observed mixed crops vary in setting (field vs. plot trial), seed composition, yield, years since establishment, and time of the season. Synthetic training images with pixel-wise hierarchical and instance labels are provided for supervised training. An additional 31,600 unlabeled images are provided for pre-training, semi-supervised training, or unsupervised training. Furthermore, this paper defines two challenges, semantic segmentation and prediction of the biomass composition, and provides a baseline model for the dataset.
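The synthetic training images mentioned above come with labels for free, because the label mask is produced at the same time the image is composed. A toy sketch of that idea, under invented class ids and rectangular "sprites" standing in for cut-out plant images:

```python
# Hypothetical sketch of synthetic training-image generation: paste labelled
# plant "sprites" onto a background and emit the image together with its
# pixel-wise label mask. Class ids and sprite shapes are invented here.
import random

SOIL, CLOVER, GRASS = 0, 1, 2

def compose(height, width, sprites, seed=0):
    """Return (image, mask); sprites are (class_id, h, w) rectangles."""
    rng = random.Random(seed)
    image = [[SOIL] * width for _ in range(height)]  # stand-in for an RGB canvas
    mask = [[SOIL] * width for _ in range(height)]   # per-pixel class labels
    for cls, sh, sw in sprites:
        top = rng.randrange(height - sh + 1)
        left = rng.randrange(width - sw + 1)
        for r in range(top, top + sh):
            for c in range(left, left + sw):
                image[r][c] = cls   # sprite pixels overwrite the background
                mask[r][c] = cls    # the label is known exactly, no annotation
    return image, mask

img, msk = compose(8, 8, [(CLOVER, 3, 3), (GRASS, 2, 4)])
```

A real generator would paste photographed plant cut-outs with alpha blending, scaling, and rotation, but the key property is unchanged: every composed pixel's class is known by construction.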
Optimal fertilization of clover-grass fields relies on knowledge of the clover and grass fractions. This study shows how that knowledge can be obtained by automatically analyzing images collected in the field. A fully convolutional neural network was trained to create a pixel-wise classification of clover, grass, and weeds in red, green, and blue (RGB) images of clover-grass mixtures. The clover fractions of the dry matter estimated from the images were found to be highly correlated with the true clover fractions of the dry matter, making this a cheap and non-destructive way of monitoring clover-grass fields. The network was trained solely on simulated top-down images of clover-grass fields, which enables it to distinguish clover, grass, and weed pixels in real images. The use of simulated images for training reduces the manual labor to a few hours, compared with more than 3000 h if all the real images were annotated for training. The network was tested on images with varied clover/grass ratios and achieved an overall pixel classification accuracy of 83.4%, while estimating the dry matter clover fraction with a standard deviation of 7.8%.
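Once a pixel-wise classification exists, a canopy clover fraction can be read off by counting class labels. A minimal sketch of that step, with assumed class ids (the papers' actual label encoding is not given here):

```python
# Hypothetical sketch: canopy clover fraction from a pixel-wise segmentation
# mask. Class ids are assumptions: 0 = soil, 1 = clover, 2 = grass, 3 = weed.
from collections import Counter

def clover_fraction(mask_rows):
    """Clover pixels as a share of all vegetation (clover + grass + weed) pixels."""
    counts = Counter(label for row in mask_rows for label in row)
    vegetation = counts[1] + counts[2] + counts[3]
    return counts[1] / vegetation if vegetation else 0.0

mask = [
    [0, 1, 1, 2],
    [0, 2, 2, 3],
]
print(clover_fraction(mask))  # 2 clover of 6 vegetation pixels, ~0.333
```

Note that this canopy-level fraction is not the same quantity as the dry-matter fraction reported in the abstract; mapping one to the other is part of what the study calibrates.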
Crop mixtures are often beneficial in crop rotations to enhance resource utilization and yield stability. While targeted management, dependent on the local species composition, has the potential to increase the crop value, it comes at a higher expense in terms of field surveys. As fine-grained mapping of within-field species distribution is typically unfeasible, the potential of targeted management remains an open research area. In this work, we propose a new method for determining the biomass species composition from high-resolution color images using a convolutional neural network based on DeepLabv3+. Data collection was performed at four separate experimental plot trial sites over three growing seasons. The method is thoroughly evaluated by predicting the biomass composition of different grass-clover mixtures using only an image of the canopy. Predicting the relative clover biomass content with R² = 0.91, we present new state-of-the-art results across the largely varying sites. Combining the algorithm with an all-terrain vehicle (ATV)-mounted image acquisition system, we demonstrate a feasible method for robust coverage and species distribution mapping of 225 ha of mixed crops at a median capacity of 17 ha per hour at 173 images per hectare.
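The R² = 0.91 figure above is the coefficient of determination between predicted and measured clover fractions. For readers unfamiliar with the metric, a small sketch with invented data values:

```python
# Sketch of the coefficient of determination (R^2) used to score predicted
# vs. measured clover biomass fractions. The data values here are invented.
def r_squared(y_true, y_pred):
    """R^2 = 1 - (residual sum of squares) / (total sum of squares)."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

measured  = [0.20, 0.35, 0.50, 0.65]   # invented ground-truth clover fractions
predicted = [0.22, 0.33, 0.52, 0.63]   # invented model outputs
print(round(r_squared(measured, predicted), 3))  # 0.986
```

An R² of 0.91 across four sites and three seasons means the model explains 91% of the variance in measured clover content from canopy images alone.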
Data sharing in research is important in order to reproduce results, develop global models, and benchmark methods. This paper presents a dataset containing image and field data from a field plot experiment with oil radish (Raphanus sativus L. var. oleiformis) as a catch crop after spring barley. The field data consist of fresh weight, dry weight, and carbon and nitrogen content from weekly plant samples collected from the plots. The image data consist of images collected weekly prior to the plant sampling. A subset of the images, corresponding to the plant sampling areas, has been annotated pixel-wise. In addition to the image and field data, weather data from the growing period are also included in the dataset. The dataset is accompanied by two challenges: 1) semantic segmentation of crops and 2) oil radish yield estimation. The former challenge focuses on the image data, while the latter focuses on the field data. Baseline methods and results are provided for both challenges.