2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9533582

COPS: Controlled Pruning Before Training Starts

Cited by 1 publication (2 citation statements)
References 13 publications

“…Most traditional PaI methods, such as LTH, dramatically raise the pruning cost since iterative pruning is needed. A distinctive branch of PaI, One-shot Network Pruning at Initialization (OPaI), solves this problem by computing the importance of each parameter in a neural network in a single step [2,5,17,24,30,33,38,39,42]. Two basic OPaI methods are SNIP [22] and GraSP [37].…”
Section: Related Work
Confidence: 99%
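
The single-step importance computation this quote attributes to OPaI methods can be made concrete. Below is a minimal sketch of SNIP-style scoring in PyTorch, assuming a standard classification setup; the helper names snip_scores and prune_masks are illustrative, not from the paper. Each weight is scored as |gradient * weight| from one forward/backward pass, then a global threshold keeps the top-scoring fraction.

# Minimal sketch of SNIP-style one-shot scoring (illustrative, not the
# paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

def snip_scores(model: nn.Module, inputs: torch.Tensor, targets: torch.Tensor):
    """Score every weight as |gradient * weight| from one forward/backward pass."""
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()
    return {
        name: (p.grad * p).abs()
        for name, p in model.named_parameters()
        if p.grad is not None and p.dim() > 1  # weight matrices/kernels only
    }

def prune_masks(scores: dict, sparsity: float):
    """Keep the globally top-scoring (1 - sparsity) fraction of weights."""
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int((1.0 - sparsity) * flat.numel()))
    threshold = torch.topk(flat, k).values.min()  # k-th largest score
    return {name: (s >= threshold).float() for name, s in scores.items()}

Because scoring needs only one mini-batch and one backward pass, this avoids the iterative prune-retrain cycles that make LTH-style methods expensive.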
“…The batch size is 128. The optimizer is SGD with an initial learning rate of 0.1, multiplied by 0.1 at epochs [30, 60, 90], a momentum of 0.9, and a weight decay of 0.0001.…”
Section: DOP: Pruning ResNet-50 with Varying Levels of Sparsity
Confidence: 99%
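
The quoted hyperparameters map directly onto a standard multi-step decay schedule. A minimal PyTorch sketch follows; the model placeholder and the 100-epoch total are assumptions for illustration, while the learning rate schedule, momentum, weight decay, and batch size come from the quote.

# Minimal sketch of the quoted training schedule (model and epoch count
# are assumptions, not from the text).
import torch

model = torch.nn.Linear(3072, 10)  # stand-in for the pruned ResNet-50
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
# Start at 0.1 and multiply by 0.1 at epochs 30, 60, and 90.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[30, 60, 90], gamma=0.1)

for epoch in range(100):  # total epoch count assumed, not stated in the quote
    # ... train for one epoch with batch size 128 ...
    scheduler.step()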