2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00050

Structured Binary Neural Networks for Accurate Image Classification and Semantic Segmentation

Abstract: In this paper, we propose to train convolutional neural networks (CNNs) with both binarized weights and activations, leading to quantized models specifically for mobile devices with limited power capacity and computation resources. Previous works on quantizing CNNs seek to approximate the floating-point information using a set of discrete values, which we call value approximation, but typically assume the same architecture as the full-precision networks. In this paper, however, we take a novel "structure approx…
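To make the "value approximation" the abstract contrasts against concrete, here is a minimal NumPy sketch of binarizing weights and activations with a hard sign function and a per-output-channel scaling factor, in the spirit of XNOR-Net-style schemes. The function names, shapes, and scaling choice are illustrative assumptions, not the paper's method.

```python
import numpy as np

def binarize_weights(w):
    """Value-approximation sketch: approximate a float conv weight tensor of shape
    (out_ch, in_ch, kH, kW) by sign(w) scaled with the per-output-channel mean
    absolute value. A generic illustration, not the paper's scheme."""
    alpha = np.mean(np.abs(w), axis=(1, 2, 3), keepdims=True)  # (out_ch, 1, 1, 1)
    b = np.where(w >= 0, 1.0, -1.0)
    return b, alpha

def binarize_activations(x):
    """Quantize activations to {-1, +1} with a hard sign."""
    return np.where(x >= 0, 1.0, -1.0)

# Toy check: how well does alpha * sign(w) reconstruct the float weights?
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 4, 3, 3)).astype(np.float32)
b, alpha = binarize_weights(w)
rel_err = np.linalg.norm(w - alpha * b) / np.linalg.norm(w)
print(f"relative reconstruction error: {rel_err:.3f}")
```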

Cited by 154 publications (106 citation statements)
References 39 publications
“…But it also should be noted that the binary models still face a great challenge, especially when the activations are quantized to 1-bit. For semantic segmentation task, as shown in Table 7, the very recent method [122] achieved high accuracy using only 1-bit, which is almost the same as the full-precision model. But it is unknown how it works and the actual speed up of that method still needs to be verified.…”
Section: Other Tasks (mentioning)
confidence: 86%
“…Thus the binary neural networks are also applied to other common tasks such as object detection and semantic segmentation. … in terms of accuracy and major computation savings [122]. In [123], SeerNet considers feature-map sparsity through low-bit quantization, which is applicable to general convolutional neural networks and tasks.…”
Section: Applications Of Binary Neural Network (mentioning)
confidence: 99%
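As a rough illustration of the low-bit feature-map quantization mentioned in the quote above, the sketch below uniformly quantizes a ReLU feature map to 4 bits and uses the cheap low-bit copy to estimate which positions stay zero. This is a generic uniform quantizer written for illustration; it is not SeerNet's actual algorithm.

```python
import numpy as np

def quantize_uniform(x, bits=4):
    """Uniformly quantize an array to 2**bits levels over its observed range.
    Generic low-bit quantizer for illustration only, not SeerNet's scheme."""
    levels = (1 << bits) - 1
    x_min, x_max = float(x.min()), float(x.max())
    scale = (x_max - x_min) / levels if x_max > x_min else 1.0
    q = np.round((x - x_min) / scale)
    return q * scale + x_min

# A ReLU feature map is typically sparse; the low-bit copy already reveals
# which positions will remain (near) zero.
rng = np.random.default_rng(1)
fmap = np.maximum(rng.standard_normal((1, 16, 8, 8)), 0.0)
fmap_q = quantize_uniform(fmap, bits=4)
print(f"zero fraction estimated from the 4-bit copy: {np.mean(fmap_q == 0.0):.2f}")
```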
“…Group-Net [24] also improves the performance of 1-bit CNNs through structural design. Inspired by the fact that a floating point number in a computer is represented by a fixed-number of binary digits, Group-Net proposes to decompose a network into binary structures while preserving its representability, rather than directly doing the quantization via "value decomposition".…”
Section: Structural Design (mentioning)
confidence: 99%
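The Group-Net quote above describes replacing a full-precision block with a group of binary structures rather than decomposing values directly. The sketch below illustrates that idea with K parallel binary branches whose outputs are combined by learned scalars; the dense transform, names, and scale values are placeholders for illustration, not Group-Net's actual binary convolutional blocks.

```python
import numpy as np

def sign(x):
    """Hard sign mapping to {-1, +1}, used for both activations and weights."""
    return np.where(x >= 0, 1.0, -1.0)

def binary_branch(x, w):
    """One binary branch: binarize the input and the branch weights, then apply
    a toy dense transform (a stand-in for a binary conv block)."""
    return sign(x) @ sign(w)

def group_forward(x, branch_weights, branch_scales):
    """Structure-approximation sketch: K parallel binary branches are aggregated
    with learned scalars to mimic one full-precision block."""
    return sum(s * binary_branch(x, w)
               for w, s in zip(branch_weights, branch_scales))

# Toy example with K = 4 branches on a 32-dimensional feature.
rng = np.random.default_rng(2)
x = rng.standard_normal((2, 32))
branch_weights = [rng.standard_normal((32, 32)) for _ in range(4)]
branch_scales = [0.5, 0.25, 0.15, 0.1]   # illustrative aggregation scalars
print(group_forward(x, branch_weights, branch_scales).shape)  # (2, 32)
```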
“…Since then, the field of BNNs and closely related Ternary Neural Networks has become a prime candidate to enable efficient inference for deep neural networks. Numerous papers have explored novel architectures (Liu et al., 2018; Rastegari et al., 2016; Zhu, Dong, & Su, 2019; Zhuang, Shen, Tan, Liu, & Reid, 2019) and optimization strategies (Alizadeh, Fernández-Marqués, Lane, & Gal, 2019; Helwegen et al., 2019), and the accuracy gap between efficient BNNs and regular DNNs is rapidly closing.…”
Section: Background: Neural Network Binarization (mentioning)
confidence: 99%