Abstract: High elevation spruce forests of the European Alps are frequently infected by the needle rust Chrysomyxa rhododendri, a pathogen causing remarkable defoliation, reduced tree growth and limited rejuvenation. Exact quantification of disease severity on different spatial scales is crucial for monitoring, management and resistance breeding activities. Based on the distinct yellow discolouration of attacked needles, it was investigated whether image analysis of digital photographs can be used to quantify disease…
“…Image capture using mobile platforms (UAVs, ground robots, etc.) is being studied in the field, although disease detection is the primary focus (Johnson et al. 2003; Garcia-Ruiz et al. 2013; de Castro et al. 2015). Measurement of severity with VIS spectrum image analysis using mobile platforms is less common (Lelong et al. 2008; Sugiura et al. 2016; Duarte-Carvajalino et al. 2018; Franceschini et al. 2019; Ganthaler et al. 2018; Liu et al. 2018), but is an area of research need. An automated VIS image analysis system on a UAV for measuring severity had moderate precision compared to visual rating (R² = 0.73), but was deemed acceptable for rating potato resistance to late blight (Sugiura et al. 2016).…”
Section: Application In Research and Practice
The severity of plant diseases, traditionally the proportion of the plant tissue exhibiting symptoms, is a key quantitative variable for many diseases, and its estimation is prone to error. Good quality disease severity data should be accurate (close to the true value). The earliest quantification of disease severity was by visual estimates. Sensor-based image analysis, including visible spectrum, multispectral and hyperspectral sensors, comprises established technologies that promise to substitute for, or complement, visual ratings. Indeed, these technologies have measured disease severity accurately under controlled conditions but have yet to demonstrate their full potential for accurate measurement under field conditions. Sensor technology is advancing rapidly, and artificial intelligence may help overcome issues for automating severity measurement under hyper-variable field conditions. The adoption of appropriate scales, training, instruction and aids (standard area diagrams) has contributed to improved accuracy of visual estimates. The apogee of accuracy for visual estimation is likely being approached, and any remaining increases in accuracy are likely to be small. Due to automation and rapidity, sensor-based measurement offers potential advantages compared with visual estimates, but the latter will remain important for years to come. Mobile, automated sensor-based systems will become increasingly common in controlled conditions and, eventually, in the field for measuring plant disease severity for the purpose of research and decision making.
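The definition above, severity as the proportion of tissue exhibiting symptoms, maps directly onto pixel counting in a visible-spectrum image. A minimal sketch of that idea, where the RGB thresholds for "symptomatic" pixels are illustrative assumptions rather than values from any cited study:

```python
import numpy as np

def severity_from_rgb(img, r_min=150, g_min=150, b_max=100):
    """Fraction of pixels classified as symptomatic (yellowish).

    img: H x W x 3 uint8 RGB array. A pixel counts as symptomatic when
    red and green are high and blue is low, i.e. a yellow discolouration.
    The thresholds are assumed values for illustration only.
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    symptomatic = (r >= r_min) & (g >= g_min) & (b <= b_max)
    return symptomatic.mean()  # proportion of symptomatic pixels

# Toy image: 100 pixels, 25 of them yellow (chlorotic), the rest green.
img = np.full((10, 10, 3), (50, 150, 50), dtype=np.uint8)  # healthy green
img[:5, :5] = (200, 200, 50)                               # chlorotic yellow
print(severity_from_rgb(img))  # 0.25
```

Real pipelines add segmentation of the plant from the background first, since a fixed colour rule applied to the whole frame conflates soil or sky with symptoms.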
“…When the calibration and training steps were performed, the reliability and accuracy obtained by RUST were very high and comparable to those obtained in previous work. For instance, Ganthaler et al. [44] reported coefficients of determination between 0.87 (natural background) and 0.95 (black background) when they compared the evaluation of the distinct yellow discoloration of rust-attacked spruce needles by image analysis and by conventional methods. Similarly, Bock et al. [45] reported bias correction factors between 0.93 and 0.99 in their comparison of the number of citrus canker lesions on grapefruit leaves estimated with Assess or by visual ratings, indicating that image analysis was more reliable when repeated than visual raters.…”
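The agreement statistics quoted above (Lin's concordance correlation coefficient and its bias correction factor) are straightforward to compute. A sketch using Lin's (1989) formulation with population moments; the function names are my own, not from the RUST script:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two sets of ratings.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    computed with population (ddof=0) moments.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()
    return 2 * sxy / (x.var() + y.var() + (mx - my) ** 2)

def bias_correction_factor(x, y):
    """Cb = CCC / Pearson r: penalises systematic shift/scale bias only."""
    r = np.corrcoef(x, y)[0, 1]
    return lins_ccc(x, y) / r

ratings = [5.0, 12.0, 30.0, 55.0, 80.0]          # e.g. visual severity (%)
print(lins_ccc(ratings, ratings))                 # 1.0: perfect agreement
print(lins_ccc(ratings, [v + 5 for v in ratings]))  # < 1.0: constant offset
```

A constant offset leaves Pearson's r at 1 but lowers the CCC, which is exactly what the bias correction factor isolates: values near 1 (as in Bock et al. [45]) mean little systematic bias between methods.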
Recently, phenotyping has become one of the main bottlenecks in plant breeding and fundamental plant science. This is particularly true for plant disease assessment, which has to deal with time-consuming evaluations and the subjectivity of visual assessments. In this work, we have developed an open source Robust, User-friendly Script Tool (RUST) for semi-automated evaluation of leaf rust diseases. RUST runs under the free Fiji imaging software (developed from ImageJ), which is well recognized among the scientific community. The script enables the evaluation of leaf rust diseases using a color transformation tool and provides three different automation modes. The script opens images sequentially and records infection frequency (pustules per area) (semi-)automatically for high-throughput analysis. Furthermore, it can manage several scanned leaf segments in the same image, consecutively selecting the desired segments. The script has been validated with nearly 900 samples from 80 oat genotypes, ranging from resistant to susceptible and from very lightly to heavily infected leaves, showing high accuracy with a Lin’s concordance correlation coefficient of 0.99. The analyses show high repeatability, as indicated by the low coefficients of variation obtained when repeating the measurement of the same samples. The script also has optional steps for calibration and training to ensure accuracy, even in low-resolution images. This script can efficiently evaluate hundreds of leaves, facilitating the screening of novel sources of resistance to this important cereal disease.
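The infection-frequency metric the script records (pustules per area) amounts to counting connected foreground regions in a segmented pustule mask and dividing by the imaged leaf area. A minimal sketch of that counting step, using plain breadth-first labelling; this is an illustration of the general technique, not RUST's actual implementation, and the area calibration factor is an assumed input:

```python
import numpy as np

def count_pustules(mask):
    """Count 4-connected foreground components in a binary pustule mask."""
    mask = np.asarray(mask, bool)
    seen = np.zeros_like(mask)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1                      # new pustule found
                stack = [(i, j)]
                seen[i, j] = True
                while stack:                    # flood-fill its pixels
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
    return count

def infection_frequency(mask, cm2_per_pixel):
    """Pustules per cm² of imaged area (cm2_per_pixel from scanner calibration)."""
    return count_pustules(mask) / (mask.size * cm2_per_pixel)

# Toy mask with three separate pustules.
m = np.zeros((6, 6), bool)
m[0, 0] = True
m[2, 2:4] = True
m[5, 5] = True
print(count_pustules(m))  # 3
```

In practice the mask would come from the colour transformation mentioned in the abstract, and touching pustules would need morphological separation before counting.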
“…Soil sampling was conducted to determine possible origins of nutritional deficiencies [109] and to measure the field capacity [147]. Ground-based photographs were taken to calculate canopy cover [148], for the documentation of weather conditions [137], and for an improved categorization of pest infestation [44,118,142,146], disease [122], and fire [149] severity classes. Smigaj et al. [136] collected data on intratrunk water flow, canopy temperature, soil moisture, and incident and reflected light using an array of sensors.…”
Section: Complementary Data: Fieldwork and Traditional Remote Sensing
“…In some cases, spectral data were enriched with structural information derived from point clouds, digital surface models (DSMs) or canopy height models (CHMs) based on either LiDAR [116–118] or image data [96,102,119,120], further improving classification results. In a few papers, the raw drone images were directly analyzed without further processing [44,108,121–126]. To create photogrammetric products, the researchers predominantly implemented commercial SfM software such as Agisoft Metashape (Agisoft LLC, St. Petersburg, Russia) and Pix4D (Pix4D S.A., Lausanne, Switzerland).…”
In recent years, technological advances have led to the increasing use of unmanned aerial vehicles (UAVs) for forestry applications. One emerging field for drone application is forest health monitoring (FHM). Common approaches for FHM involve small-scale, resource-intensive fieldwork combined with traditional remote sensing platforms. However, the highly dynamic nature of forests requires timely and repetitive data acquisition, often at very high spatial resolution, where conventional remote sensing techniques reach the limits of feasibility. UAVs have shown that they can meet the demands of flexible operation and high spatial resolution. This is also reflected in a rapidly growing number of publications using drones to study forest health. Only a few reviews exist, and none covers the whole research history of UAV-based FHM. Since a comprehensive review is becoming critical to identify research gaps, trends, and drawbacks, we offer a systematic analysis of 99 papers covering the last ten years of research related to UAV-based monitoring of forests threatened by biotic and abiotic stressors. Advances in drone technology are being rapidly adopted and put into practice, further improving the economical use of UAVs. Despite the many advantages of UAVs, such as their flexibility, relatively low costs, and the possibility to fly below cloud cover, we also identified some shortcomings: (1) multitemporal and long-term monitoring of forests is clearly underrepresented; (2) the rare use of hyperspectral and LiDAR sensors must drastically increase; (3) complementary data from other RS sources are not sufficiently being exploited; (4) a lack of standardized workflows poses a problem for ensuring data uniformity; (5) complex machine learning algorithms and workflows obscure interpretability and hinder widespread adoption; (6) the data pipeline from acquisition to final analysis often relies on commercial software at the expense of open-source tools.