Proceedings of the 26th ACM International Conference on Architectural Support for Programming Languages and Operating Systems 2021
DOI: 10.1145/3445814.3446723

Warehouse-scale video acceleration: co-design and deployment in the wild

Cited by 34 publications (7 citation statements)
References 46 publications
“…A second example further illustrates “hardware-software co-design” (Ranganathan et al., 2021), that is, the close coordination between hardware and software components around specific application requirements. To deal with the vast number of uploaded videos and the increasing diversity of viewing devices, YouTube has designed Video Coding Units (VCUs) that can transcode files to different formats in parallel, using a “multiple-output transcoding” approach where certain processing steps are shared to increase throughput and efficiency (Ranganathan et al., 2021). Google also uses these VCUs for their cloud gaming service Stadia, again showing how product variety enables synergies that make costly investments in cutting-edge technologies viable.…”
Section: Google As Technical System
Citation type: mentioning
confidence: 99%
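
To make the “multiple-output transcoding” idea in the statement above concrete, the sketch below models it in Python: the expensive decode (and each per-resolution downscale) is performed once per upload and shared by every requested output encode. The Target fields and the decode/downscale/encode helpers are illustrative stand-ins, not the VCU's interfaces or YouTube's actual pipeline.

```python
# Toy illustration of "multiple-output transcoding": one decode per upload
# and one downscale per output resolution, shared across all target encodes.
# decode/downscale/encode are simplified stand-ins, not real codec APIs.

from dataclasses import dataclass


@dataclass(frozen=True)
class Target:
    codec: str    # e.g. "h264", "vp9"  (illustrative values)
    height: int   # e.g. 1080, 720, 480


def decode(source: bytes) -> list[str]:
    # Stand-in: pretend the upload decodes into three raw frames.
    return [f"frame{i}" for i in range(3)]


def downscale(frames: list[str], height: int) -> list[str]:
    # Stand-in: tag each frame with its output resolution.
    return [f"{frame}@{height}p" for frame in frames]


def encode(frames: list[str], codec: str) -> bytes:
    # Stand-in: "encode" by joining the frames under the codec name.
    return f"{codec}:{'|'.join(frames)}".encode()


def transcode_multiple_outputs(source: bytes, targets: list[Target]) -> dict[Target, bytes]:
    frames = decode(source)                  # shared step: one decode per upload
    scaled: dict[int, list[str]] = {}        # shared step: one downscale per size
    outputs: dict[Target, bytes] = {}
    for target in targets:
        if target.height not in scaled:
            scaled[target.height] = downscale(frames, target.height)
        outputs[target] = encode(scaled[target.height], target.codec)  # per-target encode
    return outputs


if __name__ == "__main__":
    wanted = [Target("h264", 1080), Target("vp9", 1080), Target("vp9", 720)]
    for target, payload in transcode_multiple_outputs(b"raw-upload", wanted).items():
        print(target, payload)
```

Run as a script, it produces one output per (codec, resolution) pair while decoding only once and downscaling once per resolution, which is the sharing the statement credits for the throughput and efficiency gains.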
“…Several cloud providers already integrate ASIC accelerators in their datacenters. For example, Google Cloud, Microsoft Azure and Amazon AWS use dedicated hardware for neural network inference (Google Cloud TPU [184,185], Habana Goya [186]), neural network training (AWS Trainium [187], Habana Gaudi [186]), video transcoding (Google VCU [188]), and compression/encryption/data authentication (Microsoft's Project Corsica [189]). Recently, different commercial PIM designs [95-98, 108-111, 190], which target large cloud systems, have been proposed.…”
Section: Integrating Polynesia In the Cloud
Citation type: mentioning
confidence: 99%
“…Polynesia's energy savings (§10.6) are also attractive to a cloud environment, as they can significantly lower operating costs. As prior works show [188,191], ASIC accelerators are a viable solution for cloud environments, depending on the scale of the computation. Even though we cannot accurately predict the price of integrating Polynesia into a cloud system due to unknown parameters (e.g., non-recurring engineering expenses), we believe it would be a beneficial solution for cloud systems due to the widespread use of the database applications it targets.…”
Section: Integrating Polynesia In the Cloud
Citation type: mentioning
confidence: 99%
“…Because the fixed-function hardware cores are non-fungible, we moved from a uniform CPU-cost scheduler to an online multidimensional bin-packing scheduler [shown in Sec. 3.3.3 of the work by Ranganathan et al. 3] to ensure no single VCU becomes completely saturated and no encoder cores become starved.…”
Section: Cluster and Warehouse-scale Scheduling
Citation type: mentioning
confidence: 99%
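
The scheduling change quoted above can be illustrated with a minimal online multidimensional bin-packing sketch: each job carries a demand vector over several resource dimensions, each VCU has a capacity vector, and an arriving job is placed on the feasible unit whose busiest dimension stays lowest. The dimensions ("decoder", "encoder", "memory_bw") and the greedy worst-dimension heuristic are assumptions for illustration, not the production policy described by Ranganathan et al.

```python
# Minimal sketch of online multidimensional bin packing for placing jobs on
# accelerator units. Resource dimensions and the heuristic are illustrative
# assumptions, not the production scheduler.

from __future__ import annotations

from dataclasses import dataclass, field

DIMS = ("decoder", "encoder", "memory_bw")  # hypothetical resource axes


@dataclass
class VCU:
    name: str
    capacity: dict[str, float]
    used: dict[str, float] = field(default_factory=lambda: {d: 0.0 for d in DIMS})

    def fits(self, demand: dict[str, float]) -> bool:
        # A job fits only if every dimension stays within capacity.
        return all(self.used[d] + demand[d] <= self.capacity[d] for d in DIMS)

    def max_utilization_after(self, demand: dict[str, float]) -> float:
        # Saturation of the busiest dimension if this job were placed here.
        return max((self.used[d] + demand[d]) / self.capacity[d] for d in DIMS)

    def place(self, demand: dict[str, float]) -> None:
        for d in DIMS:
            self.used[d] += demand[d]


def schedule(job: dict[str, float], vcus: list[VCU]) -> VCU | None:
    """Online placement: pick the feasible VCU whose busiest dimension
    stays lowest, spreading load so no unit saturates on any one axis."""
    candidates = [v for v in vcus if v.fits(job)]
    if not candidates:
        return None  # the caller would queue or retry the job
    best = min(candidates, key=lambda v: v.max_utilization_after(job))
    best.place(job)
    return best


if __name__ == "__main__":
    fleet = [VCU(f"vcu{i}", {d: 1.0 for d in DIMS}) for i in range(2)]
    jobs = [
        {"decoder": 0.2, "encoder": 0.5, "memory_bw": 0.3},
        {"decoder": 0.6, "encoder": 0.1, "memory_bw": 0.2},
        {"decoder": 0.3, "encoder": 0.6, "memory_bw": 0.4},
    ]
    for job in jobs:
        chosen = schedule(job, fleet)
        print(job, "->", chosen.name if chosen else "queued")
```

Choosing the placement that minimizes the busiest dimension spreads load across units, which is the property the quote emphasizes: no single VCU saturates on one resource while encoder cores elsewhere sit starved.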