2022 IEEE High Performance Extreme Computing Conference (HPEC)
DOI: 10.1109/hpec55821.2022.9926331

AI and ML Accelerator Survey and Trends

Abstract: This paper updates the survey of AI accelerators and processors from the past three years. It collects and summarizes the current commercial accelerators that have been publicly announced with peak performance and power consumption numbers. The performance and power values are plotted on a scatter graph, and a number of dimensions and observations from the trends on this plot are again discussed and analyzed. Two new trends plots based on accelerator release dates are included in this year's paper, along w…

Cited by 36 publications (16 citation statements)
References 47 publications
“…However, despite all the advances, empirical evidence based on commercial AI accelerators indicates a plateauing of energy efficiency at 10 TOPS/W, or 100 fJ/operation [221]. This is attributed to the energy costs associated with data movement, as indicated in Figure 18a. MVM operations cost tens of fJ, whereas accessing on-chip SRAM costs about 1 pJ/byte.…”
Section: Development of Computing-in-Memory Based AI Accelerators
confidence: 99%
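The two figures quoted above are the same quantity in different units: 10 TOPS/W is 10 × 10^12 operations per joule, i.e. 100 fJ per operation. A quick sanity check (an illustrative snippet, not code from the survey):

ops_per_joule = 10e12                          # 10 TOPS/W = 10e12 ops per joule (1 W = 1 J/s)
energy_per_op_joules = 1.0 / ops_per_joule     # 1e-13 J per operation
energy_per_op_femtojoules = energy_per_op_joules * 1e15
print(energy_per_op_femtojoules)               # 100.0 fJ per operation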
“…Digital accelerators typically contain arrays of processor elements (PEs) that can perform several MVM operations in a pipeline-parallel fashion. However, despite all the advances, empirical evidence based on commercial AI accelerators indicates a plateauing of energy efficiency at 10 TOPS/W or 100 fJ/operation. This is attributed to the energy costs associated with data movement, as indicated in Figure 18a.…”
Section: Computing-in-Memory System for AI Accelerators
confidence: 99%
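To make the PE-array picture concrete, below is a minimal NumPy sketch of a matrix-vector multiply tiled across a grid of processing elements, where each PE owns one block of the weight matrix. The function name, grid size, and blocking scheme are illustrative assumptions, not the dataflow of any particular accelerator:

import numpy as np

def tiled_mvm(W, x, pe_rows=4, pe_cols=4):
    # Schematic PE-array model: the (i, j)-th PE multiplies its block of W
    # by its slice of x and accumulates into the output rows it owns.
    M, N = W.shape
    y = np.zeros(M)
    row_step = -(-M // pe_rows)   # ceil(M / pe_rows): rows per PE row
    col_step = -(-N // pe_cols)   # ceil(N / pe_cols): columns per PE column
    for i in range(pe_rows):
        for j in range(pe_cols):
            r = slice(i * row_step, min((i + 1) * row_step, M))
            c = slice(j * col_step, min((j + 1) * col_step, N))
            y[r] += W[r, c] @ x[c]
    return y

W = np.random.randn(8, 8)
x = np.random.randn(8)
assert np.allclose(tiled_mvm(W, x), W @ x)

In hardware these block products would proceed concurrently and be streamed through a pipeline; the loop here only models which PE touches which data.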
“…Litz wire has also been found particularly suitable for constructing ULF RF receiver coils (68-70). On the other hand, new computational strategies will continue to be developed to advance ULF image quality (e.g., higher spatial resolution through a higher SR factor) by exploiting rapidly evolving DL algorithms and architectures as well as ever-increasing computing power and the availability of large-scale human MRI data (59,71,72). Future efforts shall also encompass the experimental assessment and optimization of ULF data acquisition and DL image reconstruction to yield optimal trade-offs between image fidelity, resolution, contrast, scan time, and cost for each specific application.…”
Section: Challenges
confidence: 99%
“…Introducing an interaction between weights is, strictly speaking, not necessary to satisfy condition (5). Since the magnetic moment directions are subject to the normalization condition $\mathbf{S}_j^2 = 1$, the energy functional is always bounded, hence it has at least one minimum.…”
Section: Spin Boltzmann Machine
confidence: 99%
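The boundedness claim can be spelled out; as a sketch, assuming a quadratic spin-glass-type energy functional (the exact functional used by the citing paper may differ):

\[
E(\{\mathbf{S}_j\}) = -\tfrac{1}{2}\sum_{i \neq j} J_{ij}\,\mathbf{S}_i \cdot \mathbf{S}_j,
\qquad
|E| \le \tfrac{1}{2}\sum_{i \neq j} |J_{ij}|\,|\mathbf{S}_i|\,|\mathbf{S}_j| = \tfrac{1}{2}\sum_{i \neq j} |J_{ij}|,
\]

since $\mathbf{S}_j^2 = 1$ confines every spin to the unit sphere. The configuration space is therefore compact, and a continuous $E$ attains its minimum on it.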
“…At the moment, computers can solve many artificial intelligence (AI) tasks; however, the energy efficiency of the organic brain is still much higher. The need to reduce energy consumption is driving the emergence of dedicated AI accelerators [5]. Widespread digital computers are good for precise computation, but analog devices and stochastic computing better suit the needs of ML [6].…”
Section: Introduction
confidence: 99%
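As a generic illustration of why stochastic computing pairs naturally with approximate ML arithmetic (a sketch using the usual unipolar encoding, not drawn from the cited work): values in [0, 1] encoded as random bitstreams can be multiplied with a single AND gate per bit.

import random

def to_bitstream(p, n=10_000, seed=None):
    # Unipolar encoding: a value p in [0, 1] becomes a bitstream whose fraction of 1s is p.
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

def sc_multiply(a_bits, b_bits):
    # Stochastic multiplication: bitwise AND of two independent streams;
    # the fraction of 1s in the result approximates a * b.
    ones = sum(x & y for x, y in zip(a_bits, b_bits))
    return ones / len(a_bits)

a, b = 0.6, 0.5
est = sc_multiply(to_bitstream(a, seed=1), to_bitstream(b, seed=2))
print(est)  # close to 0.30, with roughly 1/sqrt(n) statistical error

The hardware cost of a multiplier collapses to one gate, at the price of precision that improves only with stream length, which is consistent with the quoted point that exact digital arithmetic is not always what ML workloads need.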