SinReQ: Generalized Sinusoidal Regularization for Low-Bitwidth Deep Quantized Training
Preprint, 2019
DOI: 10.48550/arxiv.1905.01416

Abstract: Deep quantization of neural networks (below eight bits) offers significant promise in reducing their compute and storage cost. Albeit alluring, without special techniques for training and optimization, deep quantization results in significant accuracy loss. To further mitigate this loss, we propose a novel sinusoidal regularization, called SinReQ, for deep quantized training. SinReQ adds a periodic term to the original objective function of the underlying training algorithm. SinReQ exploits the periodicity, di…
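The periodic regularization term described in the abstract can be sketched as follows. This is a minimal illustration assuming symmetric uniform quantization over [-1, 1]; the step size `delta`, the weighting `lam`, and the exact functional form are assumptions for this sketch, not the paper's verbatim formulation.

```python
import numpy as np

def sinreq_penalty(weights, num_bits, lam=1.0):
    """Sinusoidal regularization sketch: sin^2 vanishes exactly at the
    uniform quantization levels k * delta, so adding this term to the
    training loss pulls weights toward those levels.

    Assumed setup (not necessarily the paper's): symmetric uniform
    quantization over [-1, 1] with 2**num_bits - 1 steps.
    """
    delta = 2.0 / (2 ** num_bits - 1)  # assumed quantization step size
    return lam * np.sum(np.sin(np.pi * weights / delta) ** 2)

# Weights exactly on the 3-bit grid (multiples of delta = 2/7) incur
# near-zero penalty; weights halfway between two levels incur the
# maximum penalty of 1 each.
on_grid = np.array([0.0, 2 / 7, -4 / 7])
off_grid = np.array([1 / 7, 3 / 7, -3 / 7])
print(sinreq_penalty(on_grid, 3))   # ~0
print(sinreq_penalty(off_grid, 3))  # ~3
```

In training, such a penalty would simply be added to the task loss, so the gradient of the periodic term nudges each weight toward its nearest quantization level during ordinary gradient descent.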

Cited by 1 publication (3 citation statements)
References 8 publications (11 reference statements)
“…Such an approach, however, does not lend itself well to non-binary representations and also non-linear quantization schemes. Elthakeb et al. [9] alleviate the former issue by using a sinusoidal regularizer on the quantized weights.…”
Section: Related Work
Confidence: 99%
“…There is no dependence on the use of STEs, which greatly improves ease of implementation. Also, compared to previous similar regularizer-based approaches [9,12,33], since in QGT the regularizer is applied on the weight values directly rather than the quantized values, there is no need to learn the scale of the quantized weights separately. Using regularizers, QGT can enforce properties such as clustering of weight values into quantized bins, which can accommodate non-linear, hardware-specific quantizers.…”
Section: Related Work
Confidence: 99%
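The clustering idea in the statement above — regularizing the full-precision weights directly toward an arbitrary set of levels — can be sketched as below. The power-of-two level set and the squared-distance form are hypothetical choices for illustration, not the exact QGT formulation.

```python
import numpy as np

def nearest_bin_penalty(weights, levels, lam=1.0):
    """Clustering-style regularizer sketch: squared distance from each
    full-precision weight to its nearest quantization level. Because the
    levels can be any finite set (e.g. power-of-two values), a term like
    this can accommodate non-uniform, hardware-specific quantizers.
    Illustrative only; not the exact QGT formulation.
    """
    levels = np.asarray(levels)
    dists = np.abs(np.asarray(weights)[:, None] - levels[None, :])
    return lam * np.sum(np.min(dists, axis=1) ** 2)

# Hypothetical power-of-two level set for a non-uniform quantizer.
po2_levels = [-0.5, -0.25, -0.125, 0.0, 0.125, 0.25, 0.5]
print(nearest_bin_penalty([0.25, -0.5, 0.0], po2_levels))  # 0.0: all on-grid
print(nearest_bin_penalty([0.3], po2_levels))              # ~(0.3 - 0.25)^2
```

Note that the penalty operates on the raw weight values, so no separate quantization scale has to be learned for the regularizer itself, which is the point the citing authors make.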