There is an increasing interest in agricultural robotics and precision farming. In such domains, relevant datasets are often hard to obtain, as dedicated fields need to be maintained and the timing of the data collection is critical. In this paper, we present a large-scale agricultural robot dataset for plant classification as well as localization and mapping that covers the relevant growth stages of plants for robotic intervention and weed control. We used a readily available agricultural field robot to record the dataset on a sugar beet farm near Bonn in Germany over a period of three months in the spring of 2016. On average, we recorded data three times per week, starting at the emergence of the plants and stopping when the field was no longer accessible to the machinery without damaging the crops. The robot carried a four-channel multi-spectral camera and an RGB-D sensor to capture detailed information about the plants. Multiple lidar and global positioning system sensors as well as wheel encoders provided measurements relevant to localization, navigation, and mapping. All sensors were calibrated before the data acquisition campaign. In addition to the data recorded by the robot, we provide lidar data of the field recorded using a terrestrial laser scanner. We believe this dataset will help researchers to develop autonomous systems operating in agricultural field environments. The dataset can be downloaded from http://www.ipb.uni-bonn.de/data/sugarbeets2016/.
Conventional farming still relies on large quantities of agrochemicals for weed management, which have several negative side‐effects on the environment. Autonomous robots offer the potential to reduce the amount of chemicals applied, as they can monitor and treat each plant in the field individually, thereby avoiding uniform chemical treatment of the whole field. Such agricultural robots need the ability to identify individual crops and weeds in the field using sensor data and must additionally select effective treatment methods based on the type of weed. For example, certain types of weeds can only be treated effectively by mechanical means due to their resistance to herbicides, whereas other types can be treated through selective spraying. In this article, we present a novel system that provides the necessary information for effective plant‐specific treatment. It estimates the stem location for weeds, which enables robots to perform precise mechanical treatment, and at the same time provides the pixel‐accurate area covered by weeds for treatment through selective spraying. The major challenge in developing such a system is the large variability in visual appearance across different fields. Thus, an effective classification system has to robustly handle substantial environmental changes, including varying weed pressure, various weed types, different growth stages, and the changing visual appearance of the plants and the soil. Our approach uses an end‐to‐end trainable fully convolutional network that simultaneously estimates plant stem positions as well as the spatial extent of crop plants and weeds. It jointly learns to detect the stems and to compute the pixel‐wise semantic segmentation, and it incorporates spatial information by considering image sequences of local field strips. The jointly learned feature representation for both tasks furthermore exploits the crop arrangement information that is often present in crop fields.
This information is exploited even when it is only observable from the image sequences and not from a single image. Such image sequences, as typically provided by robots navigating over the field along crop rows, enable our approach to robustly estimate the semantic segmentation and stem positions despite the large variations encountered in different fields. We implemented and thoroughly tested our approach on images from multiple farms in different countries. The experiments show that our system generalizes well to previously unseen fields under varying environmental conditions, a key capability for deploying such systems in the real world. Compared to state‐of‐the‐art approaches, ours not only substantially improves the stem detection accuracy, that is, distinguishing crop and weed stems, but also improves the semantic segmentation performance.
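A stem-detection head of the kind described above typically produces a per-class confidence map from which stem coordinates are extracted as local maxima. The following is a minimal sketch of such a post-processing step, not the paper's implementation; the function name, threshold, and suppression radius are illustrative assumptions:

```python
import numpy as np

def extract_stems(heatmap, threshold=0.5, radius=3):
    """Extract stem coordinates from a per-class confidence map (H x W):
    keep pixels above `threshold` and greedily suppress weaker peaks
    within `radius` pixels of an already accepted stem."""
    ys, xs = np.where(heatmap >= threshold)
    # Visit candidate pixels in order of decreasing confidence.
    order = np.argsort(-heatmap[ys, xs])
    stems = []
    for i in order:
        y, x = int(ys[i]), int(xs[i])
        # Accept the peak only if no stronger stem was kept nearby.
        if all((y - sy) ** 2 + (x - sx) ** 2 > radius ** 2 for sy, sx in stems):
            stems.append((y, x))
    return stems

# Toy confidence map with two well-separated peaks.
h = np.zeros((10, 10))
h[2, 2] = 0.9
h[7, 7] = 0.8
print(extract_stems(h))  # [(2, 2), (7, 7)]
```

In practice the threshold and suppression radius would be tuned per dataset, e.g. to match the expected minimum spacing between plant stems.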
Applying agrochemicals is the default procedure for conventional weed control in crop production, but it has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and can thus reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.
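Stem detection accuracy of the kind compared above is commonly scored by matching predicted stem positions to ground-truth stems within a small distance tolerance. The sketch below shows one such greedy matching scheme; the tolerance value and helper name are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def match_stems(pred, gt, tol=5.0):
    """Greedily match predicted stem positions to ground-truth stems
    within a distance tolerance `tol` (in pixels); returns the counts
    of true positives, false positives, and false negatives."""
    unmatched = list(gt)
    tp = 0
    for p in pred:
        if unmatched:
            # Distance from this prediction to every unmatched ground truth.
            d = [np.hypot(p[0] - g[0], p[1] - g[1]) for g in unmatched]
            j = int(np.argmin(d))
            if d[j] <= tol:
                unmatched.pop(j)
                tp += 1
    fp = len(pred) - tp   # predictions with no ground-truth stem nearby
    fn = len(unmatched)   # ground-truth stems that were missed
    return tp, fp, fn

tp, fp, fn = match_stems([(10, 10), (40, 40)], [(12, 11), (80, 80)])
print(tp, fp, fn)  # 1 1 1
```

Precision and recall (and class-wise variants for crop vs. weed stems) follow directly from these counts.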