2022
DOI: 10.48550/arxiv.2204.10378
Preprint

A Case for Transparent Reliability in DRAM Systems

Abstract: Mass-produced commodity DRAM is the preferred choice of main memory for a broad range of computing systems due to its favorable cost-per-bit. However, today's systems have diverse system-specific needs (e.g., performance, energy, reliability) that are difficult to address using one-size-fits-all general-purpose DRAM. Unfortunately, although system designers can theoretically adapt commodity DRAM chips to meet their particular design goals (e.g., by exploiting slack in access timings to improve performance, or implem…

Cited by 3 publications (3 citation statements)
References 200 publications (482 reference statements)
“…This work was supported in part by the generous gifts provided by our industry partners, including Google, Huawei, Intel, Microsoft, and VMware, and support from the ETH Future Computing Laboratory and the Semiconductor Research Corporation. A much earlier version of this work was placed on arXiv in 2022 [418].…”
Section: Acknowledgment
Citation type: mentioning (confidence: 99%)
“…Technologies such as Non-Volatile Memory Express (NVMe) (Lersch, 2021) and new DRAM alternatives (Patel et al., 2022) are pushing the boundaries of memory capacity and speed, further enhancing the capabilities of in-memory systems. Moreover, cloud providers like Amazon Web Services, Google Cloud Platform, and Microsoft Azure are incorporating in-memory technologies into their offerings, making them accessible to a broader range of businesses.…”
Section: Background and Related Work
Citation type: mentioning (confidence: 99%)
“…Modern memory-intensive workloads have increasing memory bandwidth, latency, and capacity requirements. However, DRAM vendors often prioritize memory capacity scaling over latency and bandwidth [1][2][3][4]. As a result, main memory is an increasingly worsening bottleneck in computing systems [3,[5][6][7][8][9].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)