Published: 2020
DOI: 10.1080/08839514.2020.1792034
Transfer Learning-Based Framework for Classification of Pest in Tomato Plants

Cited by 54 publications (24 citation statements)
References 24 publications
“…The two-channel residual attention network model (B-ARNet) was used to identify the images with an accuracy of about 89%. Pattnaik et al. [39] proposed a pre-trained deep CNN framework for transfer learning for pest classification in tomato plants and achieved the highest classification accuracy of 88.83% using the DenseNet169 model. However, in actual production, tomato plants may suffer from several diseases or pests at the same time.…”
Section: Introduction
Citation type: mentioning, confidence: 99%
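The transfer-learning setup described in this statement, a pre-trained DenseNet169 backbone repurposed for pest classification, can be sketched roughly as below. This is an illustrative reconstruction, not the authors' published code: the input resolution, frozen-backbone strategy, classifier head, and optimizer settings are all assumptions.

```python
# Illustrative sketch only: the cited work reports 88.83% accuracy with a
# pre-trained DenseNet169 used for transfer learning. The exact pipeline,
# image size, optimizer, and head design here are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # assumed number of pest categories

# Load DenseNet169 pre-trained on ImageNet, without its classification head.
base = tf.keras.applications.DenseNet169(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze convolutional features for transfer learning

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # datasets assumed
```

Freezing the backbone and training only the new head is the standard transfer-learning recipe for small datasets; fine-tuning upper DenseNet blocks afterwards is a common variant.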
“…The recognition of citrus flies differs from other crop-pest recognition tasks [54,61-63], e.g., the recognition of paddy pests [62] and tomato pests [63]. In those tasks, most pest species come from different families, and their appearances show large variance.…”
Section: Discussion on Recognition of Citrus Flies
Citation type: mentioning, confidence: 99%
“…Pattnaik et al. [34] suggested a transfer learning system for the classification of pests in tomato plants based on an existing deep CNN structure. The dataset used for the analysis consisted of 859 images in 10 classes collected from online sources.…”
Section: Related Work
Citation type: mentioning, confidence: 99%
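The 859-image, 10-class dataset mentioned in this statement could be prepared for the model sketched earlier along the following lines; the directory layout, split ratio, and image size are assumptions for demonstration, not details from the paper.

```python
# Illustrative sketch only: assumes one sub-directory per pest class under
# a hypothetical data/tomato_pests/ folder and an 80/20 train/validation split.
import tensorflow as tf

IMG_SIZE = (224, 224)   # assumed input resolution, matching the model sketch
BATCH_SIZE = 32

train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/tomato_pests",            # hypothetical path
    validation_split=0.2,           # assumed 80/20 split
    subset="training",
    seed=42,
    image_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    label_mode="categorical",
)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/tomato_pests",
    validation_split=0.2,
    subset="validation",
    seed=42,
    image_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    label_mode="categorical",
)
```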