Stripe rust (Pst) is a major disease of wheat that, left untreated, leads to severe yield losses. Fungicides are often essential to control Pst when sudden outbreaks are imminent. Sensors capable of detecting Pst in wheat crops could optimize fungicide use and improve disease monitoring in high-throughput field phenotyping. Deep learning now provides new tools for image recognition and may pave the way for camera-based sensors that identify symptoms in the early stages of a disease outbreak in the field. The aim of this study was to train an image classifier, based on a deep residual neural network (ResNet), to detect Pst symptoms in winter wheat canopies. For this purpose, a large annotation database was created from images taken by a standard RGB camera mounted on a platform at a height of 2 m. Images were acquired while the platform was moved over a randomized field experiment with Pst-inoculated and Pst-free plots of winter wheat. The classifier was trained with 224 × 224 px patches tiled from the original, unprocessed camera images and was tested on different stages of the disease outbreak. At patch level, the classifier reached a total accuracy of 90%. At image level, it was evaluated with a sliding window using a large stride of 224 px, allowing for fast inference, and reached a total accuracy of 77%. Even at a stage with very low disease spread (0.5%), at the very beginning of the Pst outbreak, a detection accuracy of 57% was obtained. In the initial phase of the outbreak, with 2 to 4% disease spread, a detection accuracy of 76% was attained. With further optimization, the classifier could be implemented in embedded systems and deployed on drones, vehicles, or scanning systems for fast mapping of Pst outbreaks.
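The image-level evaluation described above, tiling each camera image into 224 × 224 px patches with a non-overlapping 224 px stride and aggregating the per-patch predictions, can be sketched as follows. This is a minimal illustration, not the study's code: the tiling convention at image borders and the aggregation rule (flagging an image if any patch exceeds a probability threshold) are assumptions, and `patch_classifier` stands in for the trained ResNet.

```python
import numpy as np

PATCH = 224   # patch size used for training in the study
STRIDE = 224  # non-overlapping stride for fast image-level inference

def tile_image(img: np.ndarray, patch: int = PATCH, stride: int = STRIDE):
    """Tile an H x W x C image into patch x patch windows.

    Returns a list of (row, col, window) tuples. Border regions that do
    not fit a full patch are skipped (an assumed convention).
    """
    h, w = img.shape[:2]
    tiles = []
    for r in range(0, h - patch + 1, stride):
        for c in range(0, w - patch + 1, stride):
            tiles.append((r, c, img[r:r + patch, c:c + patch]))
    return tiles

def classify_image(img: np.ndarray, patch_classifier, threshold: float = 0.5) -> bool:
    """Image-level decision: flag the image as diseased if any patch's
    predicted Pst probability exceeds the threshold (hypothetical rule)."""
    probs = [patch_classifier(window) for _, _, window in tile_image(img)]
    return max(probs) > threshold
```

With a stride equal to the patch size, a full-resolution camera frame decomposes into a small, disjoint set of patches, which is what makes the image-level test fast.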
Unmanned aerial vehicles (UAVs) have the potential to monitor the health status of a farmer's crop fields in full coverage. For complete and fast monitoring, however, high flight altitudes are usually needed, especially if large areas are to be observed at short time intervals. In that case, the ground resolution becomes insufficient to detect specific symptoms of crop diseases, because resolution on the submillimeter scale is required. This study pursued the idea of combining remote UAV imaging, to detect discoloration, with near-surface tractor imaging, to detect the uredospore layers that are characteristic signs of stripe (yellow) rust (caused by Puccinia striiformis Westend. f. sp. tritici) in winter wheat (Triticum aestivum L.). To simulate healthy and diseased field parts, the 3-yr experimental design included controlled infected plots and plots sprayed with fungicides as healthy controls. Imaging, disease severity rating, and crop development rating were performed along a time series. Significant differences between infected and control plots occurred in the UAV imagery, using the normalized green–red difference index, from a median (upper three leaves) infested leaf area of 3%, and in the tractor images, using the maximally stable extremal regions (MSER) detector, from 3 and 5%, respectively. In the future, it is conceivable that farmers will combine UAV imaging (aerial monitoring of crop damage across complete fields) and tractor imaging (ground monitoring to determine the cause) for automatic scanning of crop health status.
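The UAV analysis rests on the normalized green–red difference index (NGRDI), a standard vegetation index defined per pixel as (G − R) / (G + R). As a minimal sketch (the study's exact preprocessing, radiometric calibration, and leaf-area masking are not reproduced here), it can be computed from an RGB array as follows:

```python
import numpy as np

def ngrdi(rgb: np.ndarray) -> np.ndarray:
    """Normalized green-red difference index, (G - R) / (G + R),
    computed per pixel from an H x W x 3 RGB array (channel order R, G, B).

    Values range from -1 (pure red) to +1 (pure green); healthy canopy
    tends toward positive values, discoloration shifts them downward.
    """
    rgb = rgb.astype(np.float64)
    r, g = rgb[..., 0], rgb[..., 1]
    denom = r + g
    # Guard against division by zero on black pixels by substituting 1
    # in the denominator and forcing the index to 0 there.
    safe = np.where(denom > 0, denom, 1.0)
    return np.where(denom > 0, (g - r) / safe, 0.0)
```

Comparing the distribution of this index between plots is what separates infected from control plots in the UAV imagery; the tractor-image pipeline instead detects uredospore regions with an MSER-style blob detector.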