2021 IEEE International Solid-State Circuits Conference (ISSCC)
DOI: 10.1109/isscc42613.2021.9365862
25.4 A 20nm 6GB Function-In-Memory DRAM, Based on HBM2 with a 1.2TFLOPS Programmable Computing Unit Using Bank-Level Parallelism, for Machine Learning Applications

Cited by 82 publications (37 citation statements). References 6 publications.
“…In-Memory Accelerators (IMA): Accelerators are placed within memory devices on the same silicon piece, either by placing logic between memory layers [47], or by taking advantage of the 3D-stacked integration technologies to accommodate NDP capabilities on the logic layer. Considering Single Data Rate (SDR) and Double Data Rate (DDR) memories, several techniques were proposed to process data inside these memories by integrating the processing logic into the DRAM row-buffers.…”
Section: Near Data Taxonomy
Confidence: 99%
“…Other proposals integrate fine- and coarse-grained reconfigurable logic inside a logic layer [78,88]. Finally, several proposals integrate custom Application-Specific Integrated Circuits (ASICs) able to accelerate only specific applications [25,26,27,30,31,47,64,65,71].…”
Section: Near Data Taxonomy
Confidence: 99%
“…By separating the logic die and memory dies, 3D-stacked memory PIM can provide more computing resources than other PIM models. In particular, a function-in-memory (FIM) DRAM implementation has recently been proposed for running machine learning applications using HBM memory systems [12]. With the advent of FIM DRAM, the machine learning development environment for PIM systems has become more critical.…”
Section: A Processing-in-Memory Technique
Confidence: 99%
“…Recently, Kwon et al. proposed the real product deployment of an HBM-based PIM device, named function-in-memory (FIM) DRAM [12]. Compared to PIMCaffe, the processing elements (programmable computing unit, PCU) of FIMDRAM are located inside the DRAM bank, using Samsung's HBM2 fabrication technology.…”
Section: Related Work
Confidence: 99%