Proceedings of the 27th ACM International Conference on Multimedia 2019
DOI: 10.1145/3343031.3350534
daBNN

Abstract: It is widely believed that Binary Neural Networks (BNNs) can drastically accelerate inference by replacing the arithmetic operations of float-valued Deep Neural Networks (DNNs) with bit-wise operations. Nevertheless, there has been no open-source implementation supporting this idea on low-end ARM devices (e.g., mobile phones and embedded devices). In this work, we propose daBNN, a super fast inference framework that implements BNNs on ARM devices. Several speed-up and memory refinemen…
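The bit-wise replacement the abstract refers to is the standard XNOR-and-popcount reduction of a binary dot product. A minimal sketch in pure Python (the function name is mine, not daBNN's API; the real framework implements this with ARM assembly and SIMD):

```python
def bin_dot(a_bits: int, b_bits: int, n: int) -> int:
    """Dot product of two length-n vectors over {-1, +1}, each packed
    into an integer (bit i = 1 encodes +1, bit i = 0 encodes -1).

    Matching bits contribute +1 and mismatching bits -1, so the whole
    dot product collapses to one XOR and one popcount:
        dot = n - 2 * popcount(a XOR b)
    """
    mask = (1 << n) - 1          # keep only the n packed lanes
    return n - 2 * bin((a_bits ^ b_bits) & mask).count("1")

# a = [+1, -1, +1] packs to 0b101; b = [+1, +1, -1] packs to 0b110
# dot = (+1)(+1) + (-1)(+1) + (+1)(-1) = -1
```

One XOR plus one popcount replaces n multiply-accumulates, which is where the claimed acceleration comes from.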

Cited by 46 publications (9 citation statements)
References 9 publications
“…There are some inference frameworks for BNN, some of them are open-source software frameworks like BMXNet [222], BMXNet 2 [223], daBNN [224], Riptide [225], and Larq [226], and the others are not available to the public like BitStream [227]. This section discusses only the open-source software frameworks.…”
Section: Other Frameworks
confidence: 99%
“…daBNN [224] is a fast BNN inference framework for ARM devices, released under the BSD license. daBNN uses the following methods to speed up inference: an upgraded bit-packing scheme, direct binary convolution, and a novel memory layout that reduces memory accesses.…”
Section: Other Frameworks
confidence: 99%
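The bit-packing step mentioned above collapses each float weight or activation to its sign bit before the XNOR-based convolution runs. A toy illustration in pure Python (function name hypothetical; daBNN's actual scheme does this with ARM NEON SIMD over wide registers, not a scalar loop):

```python
def pack_signs(values) -> int:
    """Pack the signs of a float vector into one integer:
    bit i = 1 if values[i] >= 0 (binarized to +1), else 0 (-1)."""
    bits = 0
    for i, v in enumerate(values):
        if v >= 0:
            bits |= 1 << i
    return bits

# pack_signs([0.5, -1.2, 3.0]) -> 0b101
```

Packing 32 or more values per machine word is what makes the subsequent XOR/popcount arithmetic, and the reduced memory traffic the quote mentions, possible.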
“…BitFlow provides 1.8× speedup compared to naïve binary convolution. In this line, DaBnn is an efficient inference framework for binary neural networks on mobile platforms powered by ARM processors [50]. Our work is the first one studying the efficiency of low-bits quantized neural networks on CPU.…”
Section: Related Work
confidence: 99%
“…BNNs are suitable for working in limited-resource devices, improving the inference speed. However, in contraposition, BNNs may lose accuracy and precision, in particular when both activation and weights are binary [54][55][56]. DEXiRE employs binary neural networks to discretize only the neuron activations, based on Hypothesis 1, identifying active neurons, and making easier the induction of Boolean functions [57].…”
Section: Underlying Rational Design
confidence: 99%