2024 IEEE International Solid-State Circuits Conference (ISSCC)
DOI: 10.1109/isscc49657.2024.10454556
34.4 A 3nm, 32.5TOPS/W, 55.0TOPS/mm2 and 3.78Mb/mm2 Fully-Digital Compute-in-Memory Macro Supporting INT12 × INT12 with a Parallel-MAC Architecture and Foundry 6T-SRAM Bit Cell

Hidehiro Fujiwara,
Haruki Mori,
Wei-Chang Zhao
et al.
Cited by 2 publications (1 citation statement)
References 4 publications
“…They realize an ultra-low-power on-device inference system with hybrid multiplication/accumulation units, speculative decoding, and implicit weight generation, reducing external memory access (EMA) by 74%−81%. In addition, a special forum named Energy-Efficient AI-Computing Systems for Large-Language Models shares more practical thoughts about large language model (LLM) computing systems [9]. Georgia Institute of Technology, NVIDIA, Intel, Google, KAIST, Samsung, Axelera AI, and MediaTek present their latest research on LLM training and inference in both the cloud and at the edge.…”
Section: Trend 1: ML Chips For Generative AI
confidence: 99%
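As a rough illustration of the INT12 × INT12 multiply-accumulate (MAC) operation named in the paper's title, the sketch below shows the arithmetic in plain Python. The function names, range checks, and accumulation scheme are illustrative assumptions for exposition, not the macro's actual digital compute-in-memory implementation.

```python
# Illustrative sketch of an INT12 x INT12 multiply-accumulate (MAC).
# All names and the accumulation scheme are assumptions for exposition,
# not the macro's actual circuit-level implementation.

INT12_MIN, INT12_MAX = -(1 << 11), (1 << 11) - 1  # signed 12-bit: [-2048, 2047]

def int12(x: int) -> int:
    """Verify a value fits in a signed 12-bit word and return it."""
    if not INT12_MIN <= x <= INT12_MAX:
        raise ValueError(f"{x} does not fit in INT12")
    return x

def mac(weights, activations) -> int:
    """Dot product of two INT12 vectors.

    Each product needs up to 24 bits, so for an N-element vector the
    accumulator must be roughly 24 + log2(N) bits wide to avoid overflow.
    """
    return sum(int12(w) * int12(a) for w, a in zip(weights, activations))

acc = mac([2047, -2048, 100], [2047, 2047, -5])
```

The accumulator-width note is the key sizing constraint for any fixed-point MAC datapath: products of two 12-bit signed values occupy at most 24 bits, and each additional accumulated term adds up to one guard bit.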