2022
DOI: 10.48550/arxiv.2205.06287
Preprint

Adaptive Block Floating-Point for Analog Deep Learning Hardware

Abstract: Analog mixed-signal (AMS) devices promise faster, more energy-efficient deep neural network (DNN) inference than their digital counterparts. However, recent studies show that DNNs on AMS devices with fixed-point numbers can incur an accuracy penalty because of precision loss. To mitigate this penalty, we present a novel AMS-compatible adaptive block floating-point (ABFP) number representation. We also introduce amplification (or gain) as a method for increasing the accuracy of the number representation without…

Cited by 1 publication (2 citation statements) · References 27 publications
“…In our work, we use the BFP number format, which provides a middle ground between FXP and FP number formats. BFP has been used for DNN inference [7], [13], [46] and training [17], [59] as it is less costly than the FP formats and achieves better accuracy than the FXP formats with the same bit-width. BFP format splits tensors into groups and assigns an exponent to each group that is shared by the elements within the group.…”
Section: B. Data Formats for DNNs
confidence: 99%
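The quoted statement describes the BFP mechanism in enough detail to sketch: split a tensor into fixed-size groups, give each group one shared exponent sized to its largest magnitude, and store the elements as low-bit integer mantissas. The Python sketch below is illustrative only; the group size of 16, the 8-bit mantissa, and the helper names bfp_quantize/bfp_dequantize are assumptions, not details from the cited papers.

```python
import numpy as np

def bfp_quantize(x: np.ndarray, group_size: int = 16, mantissa_bits: int = 8):
    """Quantize a 1-D tensor to BFP: one shared exponent per group,
    low-bit signed integer mantissas for the elements."""
    pad = (-len(x)) % group_size
    groups = np.pad(x, (0, pad)).reshape(-1, group_size)
    # Pick each group's exponent so its largest magnitude fits in the
    # signed mantissa range [-(2^(b-1) - 1), 2^(b-1) - 1].
    max_abs = np.abs(groups).max(axis=1, keepdims=True)
    max_abs[max_abs == 0] = 1.0                       # all-zero group guard
    exponents = np.ceil(np.log2(max_abs)).astype(np.int32) - (mantissa_bits - 1)
    mantissas = np.round(groups / 2.0 ** exponents).astype(np.int32)
    limit = 2 ** (mantissa_bits - 1) - 1
    return np.clip(mantissas, -limit, limit), exponents

def bfp_dequantize(mantissas, exponents):
    """Reconstruct the (lossy) real values from mantissas and exponents."""
    return mantissas * 2.0 ** exponents

x = np.random.randn(64).astype(np.float32)
m, e = bfp_quantize(x)
print("max abs quantization error:", np.abs(bfp_dequantize(m, e).ravel() - x).max())
```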
“…Combining BFP with analog hardware is a promising solution as BFP enables low-precision integer operations while providing a wider dynamic range than conventional integer arithmetic. This idea of using BFP formats for analog computing was first proposed by Basumallik et al [7] for DNN inference. In our work, we combine BFP and RNS to perform DNN training in photonic hardware.…”
Section: B. Data Formats for DNNs
confidence: 99%
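To make the "low-precision integer operations" point concrete: once two groups carry shared exponents, their dot product is a pure integer multiply-accumulate on the mantissas, with the exponents folded in by a single scale at the end, which is exactly the shape of computation an analog MAC array provides. This hedged sketch reuses the illustrative bfp_quantize from above (an assumption, not code from the cited papers).

```python
# Worked example: a BFP dot product reduces to an integer MAC plus one
# final scaling by the two shared exponents.
a = np.random.randn(16).astype(np.float32)
b = np.random.randn(16).astype(np.float32)
ma, ea = bfp_quantize(a, group_size=16)
mb, eb = bfp_quantize(b, group_size=16)
acc = np.dot(ma[0].astype(np.int64), mb[0].astype(np.int64))  # integer-only MAC
approx = acc * 2.0 ** (ea[0, 0] + eb[0, 0])  # fold in both shared exponents
print(approx, np.dot(a, b))  # should agree up to quantization error
```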