2020
DOI: 10.3390/app10196866

Deep Learning Models Compression for Agricultural Plants

Abstract: Deep learning has shown promising results in plant disease detection, fruit counting, and yield estimation, and is gaining increasing interest in agriculture. Deep learning models are generally based on several million parameters that generate exceptionally large weight matrices. The latter require large memory and computational power for training, testing, and deploying. Unfortunately, these requirements make it difficult to deploy on low-cost devices with limited resources that are prese…

Cited by 22 publications (20 citation statements) · References 32 publications (30 reference statements)
“…For the VGG16 and VGG19 models, the accuracy achieved by the models is 81.3% and 96.25%, respectively [49]. The combination of pruning and post-quantization was applied to the VGG16, AlexNet, and LeNet models [50]. The pruning step was responsible for reducing the model size.…”
Section: Results (mentioning, confidence: 99%)
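
The pruning-plus-post-quantization pipeline cited in [50] can be illustrated with a minimal Python sketch, assuming a Keras workflow with the TensorFlow Model Optimization Toolkit; the tiny network and random data below are stand-ins for the VGG16/AlexNet/LeNet models and plant-image datasets of the cited work, not a reproduction of it.

import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Tiny stand-in classifier and synthetic data (placeholders for the
# VGG16/AlexNet/LeNet models and plant-disease images in [50]).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
x = np.random.rand(64, 32, 32, 3).astype("float32")
y = np.random.randint(0, 10, size=(64,))

# Step 1: magnitude pruning (a constant 50% sparsity target for brevity).
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    model,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0))
pruned.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
               metrics=["accuracy"])
pruned.fit(x, y, epochs=2, verbose=0,
           callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Step 2: strip the pruning wrappers, then apply post-training
# (dynamic-range) quantization through the TFLite converter.
final = tfmot.sparsity.keras.strip_pruning(pruned)
converter = tf.lite.TFLiteConverter.from_keras_model(final)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("pruned_quantized.tflite", "wb") as f:
    f.write(converter.convert())

Zeroed weights compress well once the serialized file is zipped, which is where the size reduction attributed to the pruning step comes from.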
“…Considering that processing is performed on large quantities of fruit and that there may be time and energy restrictions, model pruning is performed on the network to explore the possibility of smaller model sizes, more apt for real-world usage [49,50], through polynomial decay. For each model, 9 pruning experiments were performed with weight sparsity ranging from 0.9 (10% of original size) to 0.1 (90% of original size).…”
Section: Classification Model Analysis and Pruning (mentioning, confidence: 99%)
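
A hedged sketch of the nine-experiment sweep described above, again assuming the TensorFlow Model Optimization Toolkit's PolynomialDecay schedule; build_model() and end_step are illustrative placeholders, since the cited study's actual architecture and training length are not given here.

import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Toy stand-in for the fruit-classification network of the cited study.
def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(4, activation="softmax"),
    ])

end_step = 1000  # assumed number of training steps over which sparsity ramps up

# Nine experiments: final weight sparsity from 0.1 (about 90% of weights kept)
# to 0.9 (about 10% kept), ramped with a polynomial decay schedule.
pruned_models = {}
for final_sparsity in [s / 10 for s in range(1, 10)]:
    schedule = tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0,
        final_sparsity=final_sparsity,
        begin_step=0,
        end_step=end_step,
    )
    pruned_models[final_sparsity] = tfmot.sparsity.keras.prune_low_magnitude(
        build_model(), pruning_schedule=schedule)
    # Each model would then be compiled, fine-tuned with the
    # tfmot.sparsity.keras.UpdatePruningStep() callback, and evaluated so that
    # size and accuracy can be compared across sparsity levels.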
“…As demonstrated in Figure 1 from the work of Han et al. [29], all three compression techniques, under the right conditions, retain the prediction accuracy of the original model. Regardless, some studies have found that the pruning ratio affects the accuracy of the model [38], [39]. In effect, a slight reduction in accuracy is possible depending on the percentage of the model's trainable weights that are pruned [38]–[40].…”
Section: Background and Related Work (mentioning, confidence: 99%)
“…Regardless, some studies have found that the pruning ratio affects the accuracy of the model [38], [39]. In effect, a slight reduction in accuracy is possible depending on the percentage of the model's trainable weights that are pruned [38]–[40]. The current study makes the following contributions: theoretically, the study presents an approach that combines well-known DL techniques to reduce model complexity so that models require less expensive equipment to run, without the performance degradation demonstrated in past studies; and practically, the proposed approach, which solves some of the issues with DL at the edge, could be employed in contexts other than precision agriculture.…”
Section: Background and Related Work (mentioning, confidence: 99%)
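
The last two statements turn on how the fraction of pruned weights drives the accuracy loss; the mechanism is easiest to see in a minimal magnitude-pruning sketch in NumPy (the magnitude_prune helper and the random weight matrix below are illustrative, not taken from the cited papers).

import numpy as np

def magnitude_prune(weights, ratio):
    # Zero out the smallest-magnitude fraction `ratio` of the weights.
    flat = np.abs(weights).ravel()
    k = int(ratio * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)
for ratio in (0.5, 0.8, 0.9):
    sparsity = float(np.mean(magnitude_prune(w, ratio) == 0.0))
    print(f"pruning ratio {ratio:.1f} -> weight sparsity {sparsity:.2f}")

At low ratios only near-zero weights are removed and the model's function barely changes; as the ratio grows, weights that carry signal are zeroed as well, which corresponds to the accuracy reduction reported in [38]–[40].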