2020
DOI: 10.1109/lsp.2020.3031490
Quantizing Oriented Object Detection Network via Outlier-Aware Quantization and IoU Approximation

Cited by 10 publications (4 citation statements); References 8 publications.
“…Table 1 summarizes some classical post-training quantization (PTQ) schemes. Early PTQ focused on minimizing the quantization error of network parameters through techniques such as optimizing the quantization scale factor [21,22], bias correction [27,28], piecewise linear quantization [29,30], and outlier separation [31,32]. For example, Nvidia's TensorRT [22], a widely used quantization tool, searched for the optimal quantization scale factor by minimizing the Kullback-Leibler (KL) divergence between the full-precision (FP) activations and the quantized activations.…”
Section: Related Work
confidence: 99%
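The TensorRT-style entropy calibration described in the statement above can be sketched as follows. This is a simplified illustration of the general idea, not the cited paper's method or TensorRT's exact implementation; the function names (`calibrate_scale`, `kl_divergence`) and the binning parameters are assumptions made for this example:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """KL(p || q) between two discrete distributions (normalized here)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def calibrate_scale(activations, num_bins=2048, num_levels=128):
    """Search for the clipping threshold minimizing the KL distance between
    the FP activation histogram and its quantized version (a sketch of
    TensorRT-style entropy calibration for symmetric int8 quantization)."""
    hist, edges = np.histogram(np.abs(activations), bins=num_bins)
    best_kl, best_t = np.inf, edges[-1]
    for i in range(num_levels, num_bins + 1):
        # reference distribution: first i bins, outliers clipped into last bin
        ref = hist[:i].astype(np.float64).copy()
        ref[-1] += hist[i:].sum()
        # quantize the first i bins down to num_levels levels, then expand
        # back by spreading each level's mass over its nonzero source bins
        q = np.zeros(i)
        for chunk in np.array_split(np.arange(i), num_levels):
            mass = hist[chunk].sum()
            nonzero = (hist[chunk] > 0).sum()
            if nonzero:
                q[chunk] = np.where(hist[chunk] > 0, mass / nonzero, 0)
        kl = kl_divergence(ref, q)
        if kl < best_kl:
            best_kl, best_t = kl, edges[i]
    # scale factor mapping the chosen threshold onto the int8 positive range
    return best_t / (num_levels - 1)
```

The key design point the statement highlights: rather than always quantizing up to the absolute maximum activation (which lets rare outliers stretch the range), the search may pick a smaller clipping threshold whose quantized distribution better matches the original.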
“…[Table 1 of the citing paper, flattened during extraction, groups the PTQ schemes as follows:]
≥6 bit — minimizing the quantization error of network parameters: optimizing the quantization scale factor [21,22], bias correction [27,28], piecewise linear quantization [29,30], outlier separation [31,32].
≤4 bit — layer-wise reconstruction: LAPQ [23], AdaRound [24], AdaQuant [35]; block-wise reconstruction: BrecQ [25], RAPQ [33], Mr.BiQ [34], Qdrop [26].
3. Background and Theoretical Analysis 3.1.…”
Section: ≥6 Bit
confidence: 99%
“…Both too many and too few anchors can lead to poor results, and excessive anchors also increase computational complexity. Unfortunately, these algorithms rely on NMS during detection, yet not all edge devices support NMS (e.g., edge computing devices that only support integer operations) [37]. To solve these problems without manual intervention and hand-crafted prior knowledge, researchers have turned their attention to transformer-based DETR.…”
Section: Related Work
confidence: 99%
“…Both too many and too few anchors lead to poor results, and excessive anchors also increase computational complexity. Unfortunately, these algorithms rely on NMS during detection, yet not all edge devices support NMS (e.g., edge computing devices that only support integer operations) [44]. To solve these problems without manual intervention and hand-crafted prior knowledge, researchers have turned their attention to transformer-based DETR.…”
Section: Related Work
confidence: 99%
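The NMS step these statements flag as problematic on integer-only hardware is, in its standard greedy form, a short routine dominated by floating-point IoU computation. A minimal sketch (the function names and the `[x1, y1, x2, y2]` box layout are illustrative assumptions, not taken from the cited papers):

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box and an array of boxes, each as [x1, y1, x2, y2]."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (box[2] - box[0]) * (box[3] - box[1])
    area_b = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    # float division: the step that integer-only accelerators cannot run natively
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: repeatedly keep the highest-scoring box and drop
    remaining boxes that overlap it above iou_thresh."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        order = rest[iou(boxes[i], boxes[rest]) <= iou_thresh]
    return keep
```

This makes the statements' point concrete: the data-dependent loop and the floating-point IoU division are what keep NMS off purely integer edge accelerators, which is one motivation for NMS-free, transformer-based detectors such as DETR.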