2020 25th International Conference on Pattern Recognition (ICPR) 2021
DOI: 10.1109/icpr48806.2021.9412209
Compression strategies and space-conscious representations for deep neural networks

Cited by 7 publications (13 citation statements). References 16 publications.
“…In this subsection we summarize our previous results obtained when compressing only FC layers via pruning, CWS, and PWS methods, considering each layer separately, that is when each layer has its own k distinct weights [30]. They serve as a base reference for the different analyses presented here.…”
Section: Preliminary Results From Previous Studies
confidence: 99%
“…Owing to its independence of the underlying architecture and its low complexity, weight-sharing quantization has found wide application: here, the weights are first partitioned into multiple categories, then within each category a representative value is selected and used to replace all weights in that category. Such methods mainly differ in the way they subdivide the network weights, e.g., by means of clustering techniques [29], statistical methods [30,31], uniform schemes [32], or by minimizing the distortion and the entropy of the coded source [33]. We will describe these methods in detail in Section 3.…”
Section: Weight Quantization
confidence: 99%
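The quoted passage describes the core of weight-sharing quantization: partition a layer's weights into k categories, pick one representative value per category, and replace every weight by its category's representative. A minimal sketch of this idea, using a simple 1-D k-means as the clustering step (one of the techniques the passage cites; the helper name `weight_share_quantize` and all parameters are illustrative assumptions, not the papers' actual implementation):

```python
import numpy as np

def weight_share_quantize(weights, k=8, iters=20):
    """Illustrative weight-sharing quantization: cluster a layer's weights
    into k groups with a basic 1-D k-means, then replace each weight by
    its cluster centroid, leaving at most k distinct values."""
    flat = weights.ravel()
    # initialize centroids from quantiles of the weight distribution
    centroids = np.quantile(flat, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        # assign each weight to its nearest centroid
        labels = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
        # move each centroid to the mean of its assigned weights
        for j in range(k):
            members = flat[labels == j]
            if members.size:
                centroids[j] = members.mean()
    labels = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
    quantized = centroids[labels].reshape(weights.shape)
    return quantized, labels.reshape(weights.shape)

# Toy usage: a 4x4 weight matrix compressed to at most 4 shared values.
w = np.random.default_rng(1).normal(size=(4, 4)).astype(np.float32)
q, idx = weight_share_quantize(w, k=4)
print(np.unique(q).size)  # at most 4 distinct weight values remain
```

The space saving comes from storing only the k centroids plus a small per-weight index (here `idx`) instead of a full-precision value per weight; the cited methods differ mainly in how the partition into categories is chosen.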