2019
DOI: 10.1007/978-3-030-17227-5_27

Exploring Data Size to Run Convolutional Neural Networks in Low Density FPGAs

Cited by 6 publications (5 citation statements) · References 16 publications
“…During training, the data is typically in single-precision floating-point format. For inference in FPGAs, the feature maps and kernels can be converted to fixed-point format with less precision, typically 8 or 16 bits, reducing the storage requirements, hardware utilisation and power consumption [23].…”
Section: B. Object Detection With YOLO
confidence: 99%
“…During training, the data are typically in single-precision floating-point format. For inference in FPGAs, the feature maps and kernels are usually converted to fixed-point format with less precision, typically 8 or 16 bits, reducing the storage requirements, hardware utilization, and power consumption [23].…”
Section: CNN Model Optimization
confidence: 99%
“…Data quantization methods reduce the complexity of arithmetic operators and the number of bits (bitwidth) to represent parameters and activations. The complexity of hardware implementations of arithmetic operations depends on the type of data [112]. Operators for floating-point arithmetic are more complex than for fixed-point or integer arithmetic.…”
Section: Hardware-Oriented Deep Neural Network Optimizations
confidence: 99%
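The statements above all describe the same optimization: converting single-precision weights and activations to a low-bitwidth fixed-point representation before FPGA inference. A minimal Python sketch of one common scheme — symmetric per-tensor quantization to signed 8-bit integers — is shown below; the function names and the max-absolute-value scale choice are illustrative assumptions, not the exact scheme used in the cited paper.

```python
def quantize_int8(values, num_bits=8):
    """Symmetric quantization of floats to signed fixed-point integers.

    Hypothetical sketch: the scale maps the largest magnitude in the
    tensor onto the largest representable integer (127 for 8 bits).
    """
    qmax = 2 ** (num_bits - 1) - 1                      # 127 for int8
    max_abs = max((abs(v) for v in values), default=0.0)
    scale = max_abs / qmax if max_abs else 1.0
    # Round each value to the nearest quantization step and clamp.
    q = [max(-qmax, min(qmax, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Map the integers back to approximate floating-point values."""
    return [v * scale for v in q]

# Round-trip a small "kernel": the reconstruction error is bounded
# by half a quantization step (scale / 2).
weights = [0.82, -1.27, 0.05, 0.9, -0.33]
q, s = quantize_int8(weights)
recovered = dequantize(q, s)
err = max(abs(a - b) for a, b in zip(weights, recovered))
```

Storing `q` as 8-bit integers instead of 32-bit floats cuts memory traffic by 4x and lets the FPGA use small integer multipliers instead of floating-point units, which is the hardware-cost reduction the quoted statements refer to.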