This article proposes the use of deep learning models to identify lesions on cotton leaves from images of the crop taken in the field. Cultivated in most of the world, cotton is one of the most economically important agricultural crops. Its cultivation in tropical regions has made it the target of a wide spectrum of agricultural pests and diseases, so efficient solutions are required. Moreover, the symptoms of the main pests and diseases cannot be differentiated in their initial stages, and correctly identifying a lesion can be difficult for the producer. To help address this problem, the present research provides a deep learning solution for screening cotton leaves that makes it possible to monitor the health of the cotton crop and to make better management decisions. With the convolutional neural network architectures GoogLeNet and ResNet50, precisions of 86.6% and 89.2%, respectively, were obtained. Compared with traditional image-processing approaches such as support vector machines (SVM), k-nearest neighbors (KNN), artificial neural networks (ANN), and neuro-fuzzy classifiers (NFC), the convolutional neural networks proved to be up to 25% more precise, suggesting that this method can contribute to a faster and more reliable inspection of plants growing in the field.
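The convolutional layers at the heart of architectures such as GoogLeNet and ResNet50 slide learned filters over the image to produce feature maps. As a minimal illustration of that core operation (not the paper's actual networks, and with purely illustrative filter values), a plain NumPy sketch of a single valid 2-D convolution might look like this:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: the core operation of a CNN layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output value is the filter's weighted sum over one window.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 5x5 "leaf patch" (stand-in for a real image region) and a 3x3
# vertical-edge filter; in a trained CNN these weights are learned.
patch = np.arange(25, dtype=float).reshape(5, 5)
edge_filter = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]], dtype=float)
feature_map = conv2d(patch, edge_filter)
print(feature_map.shape)  # (3, 3)
```

In the full networks, many such filters are stacked in layers and their outputs pooled, which is what lets the models separate lesion textures from healthy tissue.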
The principal objective of agriculture is to produce a high yield of healthy crops. This yield can be improved by the automatic detection of diseases and the consequent reduction in pesticide use. A digital image-processing system was therefore developed and used to identify lesions on the leaves of cotton plants. A collection of 60,659 sub-metric-resolution images showing samples of soil and of both healthy and damaged leaves was obtained and processed with an algorithm that extracts texture features from 102x102-pixel samples. These samples were then analyzed with a neuro-fuzzy classifier trained to discriminate three types of region (soil, healthy leaf, and lesioned leaf). The algorithm was able to recognize all three classes. Recognition of the background was more consistent than recognition of damaged leaf areas, which in turn surpassed the recognition of healthy leaf areas; a similar trend was found for sensitivity. The overall accuracy of the system was 71.2%, suggesting that the class imbalance in the data had skewed the results, since the number of false positives was greater for the less well represented classes. The imbalance analysis (F-score) showed that, regardless of data volume, the texture attributes used yielded better results for images containing damaged areas than the overall accuracy would indicate. Therefore, given the challenges involved in the automatic identification of lesions in agricultural crops, such as variations in illumination, color, and texture, as well as occlusion, overlap, and the complexity of the imaged region, the behavior of the model was deemed satisfactory. Given the hybrid nature of the model, it should contribute to the state of the art in the use of intelligent systems in agriculture. The algorithm is available at https://github.com/rafaeufg/Cotton-diseases
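The reason per-class F-scores are used alongside overall accuracy is that accuracy can look acceptable while an under-represented class is poorly recognized. A small self-contained sketch of that evaluation, using a hypothetical confusion matrix (the counts are illustrative, not the paper's data):

```python
# Hypothetical confusion matrix: rows = true class, columns = predicted class.
classes = ["soil", "healthy leaf", "lesioned leaf"]
cm = [
    [700,  40,  60],   # soil (well represented, recognized most consistently)
    [ 50, 120,  30],   # healthy leaf
    [ 30,  20,  50],   # lesioned leaf (under-represented)
]

def per_class_f1(cm):
    """Precision, recall, and F1 computed independently for each class."""
    scores = {}
    for k, name in enumerate(classes):
        tp = cm[k][k]
        fp = sum(cm[r][k] for r in range(len(cm))) - tp  # predicted k, true other
        fn = sum(cm[k]) - tp                             # true k, predicted other
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores[name] = round(f1, 3)
    return scores

overall_acc = sum(cm[k][k] for k in range(len(cm))) / sum(map(sum, cm))
print(per_class_f1(cm))
print(round(overall_acc, 3))
```

With these toy counts the overall accuracy is about 0.79, yet the F1 for the under-represented lesioned-leaf class is far lower, which is exactly the kind of imbalance effect the F-score analysis in the study is meant to expose.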