High-Throughput Variable-to-Fixed Entropy Codec Using Selective, Stochastic Code Forests

2020
DOI: 10.1109/access.2020.2991314

Abstract: Efficient high-throughput (HT) compression algorithms are paramount to meet the stringent constraints of present and upcoming data storage, processing, and transmission systems. In particular, latency, bandwidth and energy requirements are critical for those systems. Most HT codecs are designed to maximize compression speed, and secondarily to minimize compressed lengths. On the other hand, decompression speed is often equally or more critical than compression speed, especially in scenarios where decompression…
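
For context, below is a minimal sketch of generic variable-to-fixed (Tunstall-style) entropy coding, the class of codec the paper's title refers to: a parsing dictionary maps variable-length runs of source symbols to fixed-length codeword indices. This is not the paper's selective, stochastic code-forest construction; it is the textbook Tunstall scheme, and all names here (build_tunstall_dict, encode, decode) are illustrative.

```python
import heapq

def build_tunstall_dict(probs, num_words):
    """Build a Tunstall parsing dictionary for a memoryless source.

    probs:     dict mapping each source symbol to its probability.
    num_words: maximum dictionary size (2**L for L-bit codewords).
    The most probable leaf is repeatedly expanded into one child per
    symbol, driving all leaves toward roughly equal probability.
    """
    # Max-heap over current leaves, keyed by negative probability.
    heap = [(-p, (s,)) for s, p in probs.items()]
    heapq.heapify(heap)
    # Each expansion removes one leaf and adds len(probs) leaves.
    while len(heap) + len(probs) - 1 <= num_words:
        neg_p, word = heapq.heappop(heap)      # most probable leaf
        for s, p in probs.items():
            heapq.heappush(heap, (neg_p * p, word + (s,)))
    return sorted(word for _, word in heap)

def encode(symbols, words):
    """Greedy variable-to-fixed parsing: consume a variable-length
    run of symbols until it matches a dictionary word, then emit
    that word's fixed-length index."""
    index = {w: i for i, w in enumerate(words)}
    out, buf = [], ()
    for s in symbols:
        buf += (s,)
        if buf in index:            # leaves are prefix-free, so the
            out.append(index[buf])  # first match is the only match
            buf = ()
    return out, buf                 # buf is an unparsed tail, if any

def decode(indices, words):
    """Each fixed-length index expands back to its symbol run."""
    out = []
    for i in indices:
        out.extend(words[i])
    return out

# Usage: 4-bit codewords over a skewed ternary source.
probs = {"a": 0.7, "b": 0.2, "c": 0.1}
words = build_tunstall_dict(probs, 2 ** 4)
data = list("aaabacaaabbaaaca")
codes, tail = encode(data, words)
assert decode(codes, words) + list(tail) == data
```

The fixed-length output is what makes variable-to-fixed codecs attractive for high-throughput decoding: the decoder reads constant-size codewords with no bit-level branching, which matches the abstract's emphasis on decompression speed.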

Cited by 1 publication (3 citation statements)
References 53 publications (97 reference statements)
“…Even with our architecture, learned image encoding and decoding remains a highly GPU-bound algorithm on most platforms. Therefore, as future work, even higher performance could be achieved by combining our method with other research that optimizes the neural model; on the other hand, introducing higher-performance entropy coders such as [16] and [19] could further reduce latency on desktop platforms. Especially for context-based models, combining our architecture with works such as [17] and [18] would enable more parallel execution and higher throughput.…”
Section: Discussion
confidence: 99%
“…On the other hand, works like [16] and [19] showed different approaches to implementing high-performance entropy coders. [17] and [18] demonstrated different methods enabling parallel entropy-parameter calculations in context models, significantly mitigating the serial masked-CNN performance bottleneck.…”
Section: Related Work
confidence: 99%