2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.01152

Importance Estimation for Neural Network Pruning

Abstract: Structural pruning of neural network parameters reduces computation, energy, and memory transfer costs during inference. We propose a novel method that estimates the contribution of a neuron (filter) to the final loss and iteratively removes those with smaller scores. We describe two variations of our method using the first and second-order Taylor expansions to approximate a filter's contribution. Both methods scale consistently across any network layer without requiring per-layer sensitivity analysis and can be…
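The abstract's first-order criterion reduces to an inexpensive computation during ordinary backpropagation: for each filter, sum the elementwise products of weights and their gradients, then square the result. The sketch below is a minimal PyTorch rendering of that idea; the function name `taylor_importance` and the exact per-filter grouping are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

def taylor_importance(model: nn.Module) -> dict:
    """Per-filter first-order Taylor importance scores.

    Assumes loss.backward() has already populated .grad on the
    convolution weights. The score for filter f is
    (sum over f's weights of w * dL/dw) ** 2 -- one common reading
    of the first-order criterion; a sketch, not a reference
    implementation.
    """
    scores = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d) and module.weight.grad is not None:
            w, g = module.weight, module.weight.grad
            # Sum per-weight contributions within each output filter
            # (dims 1-3 of the (out, in, kH, kW) weight tensor), then square.
            scores[name] = (w * g).sum(dim=(1, 2, 3)).pow(2).detach()
    return scores
```

In an iterative scheme, these scores would be averaged over several minibatches, the globally lowest-scoring filters removed, and the network fine-tuned before the next pruning step.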

Cited by 665 publications (522 citation statements) | References 22 publications
“…In this section, we evaluate the proposed pruning methodology in terms of accuracy, compression achieved, and the time and computational complexity of the pruning process. We compare our technique with several existing and state-of-the-art pruning methods such as [5], [9], [16], [17], [32], [33]. We have used three different datasets, namely CIFAR-10, CIFAR-100 and ImageNet, and trained the VGGNet and ResNet architectures on them.…”
Section: Results
Mentioning confidence: 99%
“…Most of the existing pruning techniques usually involve the following three stages [4]: 1) training an over-parameterized network till convergence, 2) pruning based on predefined criteria, and 3) fine-tuning/re-training to regain accuracy. The main limitation of this three-stage process adopted by existing pruning methods is that the pruning and fine-tuning stages, iterative in most cases [5]–[9], [38], impose considerable additional computation (hence, time and energy) requirements on top of the compute-heavy training stage. Based on the structure and criteria used for pruning, most of the previous works on network pruning also suffer from one or more of the following problems: … Examples of unstructured pruning methods include [5]–[11].…”
Section: Introduction
Mentioning confidence: 99%
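The three-stage recipe the quoted passage criticizes is easy to state in code. The following is a minimal sketch, assuming a hypothetical `train_fn(model, loader, epochs)` helper standing in for an ordinary training loop; the filter-norm criterion and the soft zeroing of filters are simplifications chosen to keep the example self-contained, not the procedure of any specific cited paper.

```python
import torch
import torch.nn as nn

def three_stage_pruning(model, train_loader, train_fn,
                        prune_fraction=0.1, rounds=5,
                        train_epochs=30, finetune_epochs=2):
    """Sketch of the train -> prune -> fine-tune pipeline.

    Pruning here is 'soft' (filters are zeroed in place), which avoids
    rebuilding layer shapes in a short example; structured removal
    would instead rebuild each Conv2d with fewer output channels.
    """
    train_fn(model, train_loader, epochs=train_epochs)         # 1) train to convergence
    convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]
    for _ in range(rounds):                                    # 2) iterative pruning
        # Rank all filters globally by L2 norm (a simple stand-in criterion).
        norms = [(m.weight[i].norm().item(), m, i)
                 for m in convs for i in range(m.weight.shape[0])]
        k = int(prune_fraction * len(norms))
        with torch.no_grad():
            for _, m, i in sorted(norms, key=lambda t: t[0])[:k]:
                m.weight[i].zero_()
        train_fn(model, train_loader, epochs=finetune_epochs)  # 3) fine-tune to recover accuracy
    return model
```

The repeated fine-tuning inside the loop is precisely the overhead the quoted passage objects to: each pruning round re-enters training on top of the already compute-heavy initial stage.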
“…Fortunately, some studies have focused on reducing the complexity of the STFT [25], which could simplify the computational process significantly. Besides, with the rapid development of deep learning, the complexity of the network in TFDNet could be further reduced by optimization methods such as pruning [26]. Most importantly, the construction of TFDNet may provide some inspiration for exploiting more image processing and analysis methods that are suitable for tackling problems in the UVLC system.…”
Section: Results and Analysis
Mentioning confidence: 99%
“…By using various decoding strategies that were proven in this study, both speed and performance were improved without changes to the model structure. Performance may be further improved by integrating ideas proposed in other studies in which models were lightened through strategies such as network pruning [40], knowledge distillation [41], and quantization [15]. Hence, an effective beam search technique and a new decoding technique will be investigated in future studies.…”
Section: Discussion
Mentioning confidence: 99%