2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/CVPR46437.2021.00498
Manifold Regularized Dynamic Network Pruning

Cited by 66 publications (49 citation statements)
References 37 publications
“…FBS [65] prunes 3.95% more FLOPs than AIP, and its Top-1 and Top-5 accuracy losses are also higher than ours, by 0.93% (1.59% vs. 0.66%) and 0.34% (0.86% vs. 0.52%), respectively. When k=0.5, the FLOPs cutting rate of AIP is 5.45% lower than that of ManiDP [27], and its Top-5 accuracy drop is 0.20% higher (0.52% vs. 0.32%), but its Top-1 accuracy loss is 0.22% lower (0.66% vs. 0.88%). When k=0.7, AIP deletes 58.51% of the parameters and 65.07% of the FLOPs.…”
Section: Results Comparison on ILSVRC-2012
Mentioning confidence: 96%
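The percentage-point gaps quoted in this excerpt can be reproduced directly from the stated accuracy drops. A minimal sketch (the numbers are taken from the quote above; the helper name `delta_pp` is a hypothetical illustration):

```python
# Hypothetical helper: difference between two rates, in percentage points.
def delta_pp(a, b):
    return round(a - b, 2)

# Accuracy drops quoted above (percent): FBS vs. AIP.
print(delta_pp(1.59, 0.66))  # Top-1 gap: 0.93
print(delta_pp(0.86, 0.52))  # Top-5 gap: 0.34

# ManiDP vs. AIP at k=0.5.
print(delta_pp(0.88, 0.66))  # Top-1 gap: 0.22
print(delta_pp(0.52, 0.32))  # Top-5 gap: 0.2
```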
“…In this section, we compare the proposed method with existing pruning schemes, among which Li et al. [22], SFP [23], DCP [51], FPGM [52], EDP [53], CNN-FCF [54], CCP [55], Taylor-FO-BN [56], HRank [25], and ManiDP [27] are state-of-the-art methods. Owing to differences in experimental equipment and environments, the results reported by different papers also differ somewhat.…”
Section: Methods
Mentioning confidence: 99%
“…Lin et al. [55] proposed a mathematical method that prunes filters producing low-rank feature maps, based on the observation that low-rank feature maps carry less information and their removal is easy to reproduce. Tang et al. [56] proposed a dynamic pruning method that accounts for both sample complexity and network complexity during pruning to obtain better performance. In this work, filter pruning and channel pruning are used for both YOLO v4-tiny and MobileNet SSD pruning.…”
Section: Model Pruning
Mentioning confidence: 99%
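The dynamic pruning idea described above (per-sample channel selection driven by input complexity) can be sketched in a few lines. This is a hedged illustration only: the function names and the mean-absolute-activation saliency heuristic are assumptions for exposition, not the cited authors' actual formulation.

```python
# Minimal sketch of dynamic (per-sample) channel pruning. Each channel is a
# flat list of activations for one input sample.

def channel_saliency(feature_maps):
    """Mean absolute activation per channel (a common saliency proxy)."""
    return [sum(abs(v) for v in fm) / len(fm) for fm in feature_maps]

def dynamic_prune(feature_maps, keep_ratio):
    """Zero out the least salient channels for this particular input.

    keep_ratio may vary per sample ("easy" inputs keep fewer channels),
    which is what makes the pruning dynamic rather than static.
    """
    saliency = channel_saliency(feature_maps)
    n_keep = max(1, round(keep_ratio * len(feature_maps)))
    keep = set(sorted(range(len(feature_maps)),
                      key=lambda c: saliency[c], reverse=True)[:n_keep])
    return [fm if c in keep else [0.0] * len(fm)
            for c, fm in enumerate(feature_maps)]

# Four channels of a toy feature map; keep the top half for this sample.
fmaps = [[0.1, -0.2], [1.5, 2.0], [0.0, 0.05], [0.9, -1.1]]
pruned = dynamic_prune(fmaps, keep_ratio=0.5)
print([fm != [0.0, 0.0] for fm in pruned])  # [False, True, False, True]
```

Static filter pruning, by contrast, would fix the kept set once for all inputs; the per-sample `keep_ratio` is the lever that lets easy samples use a smaller subnetwork.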
“…A number of recent works also focus on pruning redundant network connections [9,23,28,29,36,40]. Since deep networks typically contain many redundant weights, other works focus on quantizing the weights in the hope of faster inference [18,19].…”
Section: Related Work: Model Acceleration Techniques
Mentioning confidence: 99%
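The two acceleration families mentioned in this excerpt, connection pruning and weight quantization, can be sketched as follows. The sparsity level, bit-width, and function names here are illustrative assumptions, not values or APIs from the cited works.

```python
# Hedged sketch: magnitude-based weight pruning and symmetric uniform
# quantization, the two generic techniques named in the excerpt above.

def magnitude_prune(weights, sparsity):
    """Zero (at least) the smallest-magnitude fraction `sparsity` of weights.

    Ties at the threshold may prune slightly more than the requested fraction.
    """
    n_prune = int(sparsity * len(weights))
    threshold = sorted(abs(w) for w in weights)[n_prune - 1] if n_prune else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize(weights, n_bits=8):
    """Map each weight to the nearest of 2**n_bits - 1 symmetric levels."""
    scale = max(abs(w) for w in weights) / (2 ** (n_bits - 1) - 1)
    return [round(w / scale) * scale for w in weights]

w = [0.02, -0.5, 0.31, -0.01, 0.9, 0.05]
print(magnitude_prune(w, sparsity=0.5))  # [0.0, -0.5, 0.31, 0.0, 0.9, 0.0]
```

In practice the two are complementary: pruning removes connections outright, while quantization shrinks the storage and arithmetic cost of the weights that remain.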