2022
DOI: 10.1109/tcsvt.2021.3071532

Exploring Structural Sparsity in CNN via Selective Penalty

Cited by 18 publications (4 citation statements)
References 33 publications
“…Later on, several techniques for compressing models were proposed. These techniques discard certain connections or features within the networks to reduce unnecessary parameters and computational expenses [20][21][22][23].…”
Section: Model Compression Techniques
confidence: 99%
“…The model pruning approach [20] removes the less important parameters according to their impact on feature extraction performance. The model structural sparsity approach [21] learns sparse parameter matrices to reduce computational complexity during model optimisation. The weight quantisation approach [22] uses fewer bits to represent the network weights to reduce memory consumption.…”
Section: Model Compression Techniques
confidence: 99%
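The statement above names three compression families: pruning, structural sparsity, and weight quantisation. As a rough illustration only (not the method of the cited paper), a minimal sketch of magnitude-based pruning and uniform 8-bit quantisation on a toy weight matrix might look like this; the function names, 50% sparsity level, and 256-level scheme are illustrative assumptions:

```python
import numpy as np

# Toy weight matrix standing in for one CNN layer (illustrative only).
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))

def magnitude_prune(w, sparsity=0.5):
    """Zero out the `sparsity` fraction of weights with smallest magnitude."""
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) <= threshold, 0.0, w)

def quantize_uint8(w):
    """Uniform 8-bit quantisation: map weights onto 256 levels over their range."""
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / 255.0
    q = np.round((w - lo) / scale).astype(np.uint8)
    return q, scale, lo  # dequantise with q * scale + lo

Wp = magnitude_prune(W, sparsity=0.5)   # at least half the entries become 0
q, scale, lo = quantize_uint8(W)
W_restored = q * scale + lo             # reconstruction error bounded by scale / 2
```

Pruning reduces parameter count (and, with structured variants, computation), while quantisation trades a small, bounded reconstruction error for a 4x reduction in storage relative to float32.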
“…In order to improve the accuracy and quantity of vehicle detection, networks with higher accuracy are needed. However, due to their huge number of parameters and heavy computation, these networks incur high resource and computation costs [15]. Reducing the parameter count and algorithmic complexity to improve acceleration performance thus becomes one of the key difficulties.…”
Section: Introduction
confidence: 99%