2020
DOI: 10.1109/tns.2020.2977583
Impact of Tensor Cores and Mixed Precision on the Reliability of Matrix Multiplication in GPUs

Cited by 23 publications (8 citation statements)
References 10 publications
“…This happens regardless of the convolution type. Again, this is in contrast with data observed for GPUs, for which the magnitude of the error can be significantly higher (orders of magnitude) [11], [32]. This is another promising result for the TPU reliability in executing CNNs, as a higher error magnitude can have a greater impact on the output value.…”
Section: A. Atomic Operations
confidence: 65%
“…The fact that single corrupted elements are the most common distribution for the TPU is in contrast with what has been observed for Graphics Processing Units (GPUs) [11], [31], [32], for which the majority of the corrupted matrices have multiple corrupted elements. This is due to the different way matrix multiplication is implemented.…”
Section: A. Atomic Operations
confidence: 75%
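The distinction drawn in that statement, that a single hardware fault in a GPU matrix multiplication tends to corrupt many output elements, can be illustrated with a minimal fault-injection sketch (hypothetical values, with NumPy standing in for the GPU kernel, not the cited experiment): a fault in a shared input operand propagates to an entire row of the product, while a fault in a single output accumulator corrupts exactly one element.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8)).astype(np.float32)
B = rng.standard_normal((8, 8)).astype(np.float32)
gold = A @ B  # fault-free reference product

# Fault in an *input* operand: every output element that reads A[2, 5]
# (i.e. the whole row 2 of the product) is corrupted.
A_faulty = A.copy()
A_faulty[2, 5] = np.float32(1e6)  # hypothetical corrupted value
corrupted_in = (A_faulty @ B) != gold
print("input fault corrupts", int(corrupted_in.sum()), "elements")   # whole row

# Fault in a single *output* accumulator: exactly one element differs.
C_faulty = gold.copy()
C_faulty[4, 4] += np.float32(1.0)
corrupted_out = C_faulty != gold
print("output fault corrupts", int(corrupted_out.sum()), "element")  # one element
```

This mirrors the paper's observation only in spirit: on real hardware, which elements are shared depends on the tiling and on whether tensor cores are used.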
“…Several authors have explored the arithmetic accuracy of such reduced or mixed-precision operations [14], [18], [5]. Several experiments are conducted later in this article to assess those effects on our convolution algorithm.…”
confidence: 99%
“…Another study by Basso et al. (2020) explores the reliability of tensor cores in terms of the rate of hardware errors in matrix multiplications. The main finding is that low-precision operations and the use of tensor cores increase the amount of correct data produced by the GPU, despite increasing the impact of numerical errors due to the use of lower-precision data.…”
Section: Previous Work
confidence: 99%
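The trade-off summarized in that statement, that lower-precision data raises the magnitude of numerical error, can be sketched with a small NumPy comparison (an illustrative example, not data from the cited study): the same matrix product computed in half and single precision is compared against a double-precision reference.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((256, 256))
B = rng.standard_normal((256, 256))
ref = A @ B  # float64 reference product

# Single-precision product: inputs and output rounded to float32.
err_fp32 = np.abs(A.astype(np.float32) @ B.astype(np.float32) - ref).max()

# Half-precision product: inputs and output rounded to float16.
prod_fp16 = A.astype(np.float16) @ B.astype(np.float16)
err_fp16 = np.abs(prod_fp16.astype(np.float64) - ref).max()

# The half-precision error is orders of magnitude larger than the
# single-precision one, matching the qualitative claim above.
print(f"max error fp32: {err_fp32:.2e}, fp16: {err_fp16:.2e}")
```

Tensor-core mixed precision (FP16 inputs with FP32 accumulation) sits between these two extremes; this sketch only brackets the effect.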