2022
DOI: 10.1038/s41598-022-10140-z

Deep learning-based approach for identification of diseases of maize crop

Abstract: In recent years, deep learning techniques have shown impressive performance in the field of identification of diseases of crops using digital images. In this work, a deep learning approach for identification of in-field diseased images of maize crop has been proposed. The images were captured from experimental fields of ICAR-IIMR, Ludhiana, India, targeted to three important diseases viz. Maydis Leaf Blight, Turcicum Leaf Blight and Banded Leaf and Sheath Blight in a non-destructive manner with varied backgrou…

Cited by 74 publications (51 citation statements)
References 42 publications (55 reference statements)
“…Additionally, researchers are encouraged to publish results achieved using only brightness augmentation, to isolate its effect. Haque et al. [21] compared a model trained with rotation, distortion, and flipping augmentation to a model trained with brightness augmentation. The authors reported that the brightness-augmented model achieved slightly better results.…”
Section: Results and Discussion
Confidence: 99%
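The brightness augmentation this citation refers to can be sketched minimally in plain Python (an illustrative toy on grayscale images stored as nested lists of 0–255 values, not the cited authors' implementation):

```python
import random

def adjust_brightness(image, factor):
    """Scale every pixel of a grayscale image (nested 0-255 lists) by
    `factor`, clipping the result back into the valid 0-255 range."""
    return [[max(0, min(255, round(p * factor))) for p in row]
            for row in image]

def random_brightness(image, low=0.7, high=1.3, rng=random):
    """Brightness augmentation: apply a random scaling factor per image."""
    return adjust_brightness(image, rng.uniform(low, high))
```

For example, `adjust_brightness([[100, 200]], 1.5)` returns `[[150, 255]]`: the second pixel saturates at 255 rather than overflowing, which is the behavior real augmentation pipelines implement with clipping or saturating casts.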
“…Nazaré et al. [20] reached a similar conclusion, suggesting that noisy images can degrade CNNs' performance and that image quality is crucial. Haque et al. [21] trained an InceptionV3 model to classify maize crop leaves as healthy or diseased. They noticed that brightness in the dataset was not uniform because the images were taken in-field rather than under lab-controlled settings.…”
Section: Introduction
Confidence: 99%
“…Several DL-based methods have been widely adopted for the classification of maize leaf disease due to their improved feature extraction and representation capabilities. Haque et al. (2022) developed an InceptionV3-based architecture for the classification of healthy maize leaves from diseased ones. Initially, several augmentation strategies such as flipping, rotation, skewing, and distortion were applied to enhance the diversity of the input data.…”
Section: Related Work
Confidence: 99%
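The geometric augmentations named in this citation (flipping and rotation) reduce to simple index manipulations. A minimal plain-Python sketch on nested-list images (illustrative only; production pipelines use library transforms on tensors):

```python
def flip_horizontal(image):
    """Left-right mirror: reverse each row."""
    return [row[::-1] for row in image]

def flip_vertical(image):
    """Top-bottom mirror: reverse the row order."""
    return image[::-1]

def rotate_90_cw(image):
    """Rotate 90 degrees clockwise: reverse the rows, then transpose."""
    return [list(col) for col in zip(*image[::-1])]
```

Applying each transform to `[[1, 2], [3, 4]]` gives `[[2, 1], [4, 3]]`, `[[3, 4], [1, 2]]`, and `[[3, 1], [4, 2]]` respectively; chaining several such transforms with random parameters is what multiplies the effective size of a training set.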
“…The CNNs are extensively applied for the categorization of various plant leaf diseases (Sethy et al., 2020; Ngugi et al., 2021; Albattah et al., 2022). In a large number of studies, researchers fine-tuned pre-trained CNN models such as AlexNet (Rangarajan et al., 2018), GoogleNet (Mohanty et al., 2016), ResNet (Subramanian et al., 2022), InceptionNet (Haque et al., 2022), EfficientNet (Liu et al., 2020) and DenseNet (Waheed et al., 2020; Baldota et al., 2021), employing transfer learning for leaf disease identification. Some studies have proposed novel CNN architectures for plant disease identification (Picon et al., 2019; Agarwal et al., 2020; Zhang et al., 2021; Xiang et al., 2021).…”
Section: Introduction
Confidence: 99%
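The transfer-learning recipe this citation describes — freeze a pre-trained backbone, train only a small classification head — can be shown with a pure-Python toy. Here the "backbone" is a pair of hand-crafted image statistics standing in for frozen CNN features (an assumption for illustration, not any cited model), and the head is a logistic regression fitted by gradient descent:

```python
import math

def frozen_features(image):
    """Stand-in for a pre-trained backbone whose weights stay frozen.
    Returns two fixed features: mean brightness and a crude horizontal
    edge-strength proxy, both scaled to roughly [0, 1]."""
    flat = [p for row in image for p in row]
    mean = sum(flat) / len(flat)
    edge = sum(abs(row[i + 1] - row[i])
               for row in image for i in range(len(row) - 1))
    return [mean / 255.0, edge / (255.0 * len(flat))]

def train_head(features, labels, lr=0.5, epochs=200):
    """Fit a logistic-regression head on the frozen features; only the
    head's weights are updated, mirroring the transfer-learning recipe."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            g = p - y  # gradient of the log loss w.r.t. the logit
            w = [w[0] - lr * g * x[0], w[1] - lr * g * x[1]]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Class 1 if the head's logit is positive, else class 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
```

In real fine-tuning the frozen features come from a network such as InceptionV3 with its convolutional weights held fixed, and the head is a dense layer trained on the new leaf-disease labels; the division of labor is the same as in this sketch.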
“…In species-specific models, multiple disease categories of a particular crop type are considered for classification. For example, individual models have been proposed for the recognition of diseases of particular crop types: rice [15], potato [16], and maize [17]. In [15], the deep CNN model trained to identify ten disease types of the rice plant achieved 95.48% accuracy, which is much higher than the existing approaches.…”
Section: Introduction
Confidence: 99%