2021
DOI: 10.1109/tpami.2019.2952096
Scalar Quantization as Sparse Least Square Optimization

Abstract: Quantization can be used to form new vectors/matrices whose shared values are close to the originals. In recent years, scalar quantization for value-sharing applications has soared in popularity, as it has proven highly useful in reducing the complexity of neural networks. Existing clustering-based quantization techniques, while well developed, have several drawbacks, including dependence on the random seed, empty or out-of-range clusters, and high time complexity for large numbers of clusters…

Cited by 2 publications
References 35 publications