2022
DOI: 10.1007/s00521-022-07161-0

Compression of deep neural networks: bridging the gap between conventional-based pruning and evolutionary approach

Abstract: Recently, many studies have been carried out on model compression to handle the high computational cost and high memory footprint brought by the implementation of deep neural networks. In this paper, model compression of convolutional neural networks is constructed as a multiobjective optimization problem with two conflicting objectives, reducing the model size and improving the performance. A novel structured pruning method called Conventional-based and Evolutionary Approaches Guided Multiobjective Pruning (C…
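
The abstract casts structured pruning as a two-objective search: shrink the model while preserving accuracy. The sketch below only illustrates how one candidate pruning decision might be scored on both objectives; the layer shapes, the accuracy proxy, and every name in it are assumptions made here for illustration, not the paper's actual method or evaluation pipeline.

```python
# Minimal sketch (not the paper's implementation): scoring one candidate
# structured-pruning decision on the two conflicting objectives named in the
# abstract -- model size and performance. All shapes and names are assumptions.

from typing import Dict, List, Tuple

# Hypothetical small CNN described only by (in_channels, out_channels, kernel_size).
LAYERS: List[Tuple[int, int, int]] = [
    (3, 64, 3),
    (64, 128, 3),
    (128, 256, 3),
]

def pruned_param_count(keep_ratios: List[float]) -> int:
    """Objective 1: parameter count after keeping a fraction of output channels per layer."""
    total = 0
    prev_keep = 1.0  # the first layer's inputs (e.g. RGB) are not pruned
    for (cin, cout, k), ratio in zip(LAYERS, keep_ratios):
        kept_in = max(1, int(cin * prev_keep))
        kept_out = max(1, int(cout * ratio))
        total += kept_in * kept_out * k * k + kept_out  # conv weights + biases
        prev_keep = ratio
    return total

def proxy_accuracy(keep_ratios: List[float]) -> float:
    """Objective 2 (stand-in): a made-up proxy that degrades as more channels are removed.
    A real pipeline would fine-tune the pruned network and measure validation accuracy."""
    avg_keep = sum(keep_ratios) / len(keep_ratios)
    return 0.95 * avg_keep ** 0.5  # purely illustrative

def evaluate(keep_ratios: List[float]) -> Dict[str, float]:
    """Two-objective fitness: smaller `params` is better, larger `accuracy` is better."""
    return {
        "params": float(pruned_param_count(keep_ratios)),
        "accuracy": proxy_accuracy(keep_ratios),
    }

if __name__ == "__main__":
    for candidate in ([1.0, 1.0, 1.0], [0.5, 0.5, 0.5], [0.25, 0.5, 0.75]):
        print(candidate, evaluate(candidate))
```

In an evolutionary multiobjective setting, such a pair of scores would feed a Pareto-based selection step (e.g., non-dominated sorting); that machinery is omitted here.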

Cited by 9 publications (1 citation statement) · References 37 publications

“…Efficiency tests were conducted on more complex datasets and models to validate these assertions. The CIFAR-100 dataset was employed as the training dataset, while ResNet-50 served as the training model [28]. The parameters were set as follows: B = ∞, E = 3, n = k = 10, m = 2, and R = 100.…”
Section: Efficiency Analysis
Mentioning citation, confidence: 99%