2021
DOI: 10.48550/arxiv.2108.13996
Preprint

Quantization of Generative Adversarial Networks for Efficient Inference: a Methodological Study

Abstract: Generative adversarial networks (GANs) have an enormous potential impact on digital content creation, e.g., photo-realistic digital avatars, semantic content editing, and quality enhancement of speech and images. However, the performance of modern GANs comes at the cost of massive amounts of computation at inference time and high energy consumption, which complicates, or even rules out, their deployment on edge devices. The problem can be reduced with quantization, a neural network compressio…

Cited by 1 publication (1 citation statement)
References 23 publications
“…Simulation results show the superiority of the authors' method over existing update compression techniques in terms of update size and inference accuracy. Andreev et al. [110] conducted an experimental study of post-training quantization and quantization-aware training techniques for three different GAN architectures and achieved successful 4/8-bit quantization of these models.…”
Section: Model Compression
confidence: 99%
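The post-training quantization the citing work refers to can be illustrated with a uniform affine quantizer applied to a trained weight tensor. The sketch below is not the authors' implementation; the helper name `quantize_dequantize` is hypothetical, and the random tensor stands in for real GAN weights. It simulates the rounding error an 8-bit or 4-bit quantized layer would see.

```python
import numpy as np

def quantize_dequantize(w: np.ndarray, bits: int) -> np.ndarray:
    """Uniform affine (asymmetric) quantization of a weight tensor.

    Maps float weights onto integers in [0, 2**bits - 1] using a scale
    and zero point fitted to the tensor's range, then maps them back,
    which reproduces the rounding error of post-training quantization.
    """
    qmin, qmax = 0, 2 ** bits - 1
    scale = (w.max() - w.min()) / (qmax - qmin)
    zero_point = np.round(qmin - w.min() / scale)
    q = np.clip(np.round(w / scale + zero_point), qmin, qmax)
    return (q - zero_point) * scale  # dequantized approximation of w

# Illustrative stand-in for a trained weight matrix.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)
for bits in (8, 4):
    err = np.abs(quantize_dequantize(w, bits) - w).mean()
    print(f"{bits}-bit mean abs error: {err:.4f}")
```

As expected, the 4-bit approximation is noticeably coarser than the 8-bit one, which is why 4-bit GAN quantization in the cited study required more care (e.g. quantization-aware training) than 8-bit.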