2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00075
Generalized Intersection Over Union: A Metric and a Loss for Bounding Box Regression

Abstract: Intersection over Union (IoU) is the most popular evaluation metric used in the object detection benchmarks. However, there is a gap between optimizing the commonly used distance losses for regressing the parameters of a bounding box and maximizing this metric value. The optimal objective for a metric is the metric itself. In the case of axis-aligned 2D bounding boxes, it can be shown that IoU can be directly used as a regression loss. However, IoU has a plateau making it infeasible to optimize in the case of non-overlapping bounding boxes. […]
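
To make the metric in the abstract concrete, the short sketch below computes IoU and GIoU for a pair of axis-aligned boxes, following the paper's definition GIoU = IoU − |C \ (A ∪ B)| / |C|, where C is the smallest enclosing box. This is a minimal illustration rather than the authors' reference code; the corner format (x1, y1, x2, y2) and the helper name iou_and_giou are assumptions made for the example.

def iou_and_giou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2) with x1 < x2 and y1 < y2 (assumed format).
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)

    # Intersection of A and B (zero when the boxes do not overlap).
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = area_a + area_b - inter
    iou = inter / union

    # Smallest enclosing axis-aligned box C.
    area_c = (max(ax2, bx2) - min(ax1, bx1)) * (max(ay2, by2) - min(ay1, by1))

    giou = iou - (area_c - union) / area_c
    return iou, giou

# Non-overlapping boxes: IoU sits on its plateau at 0, GIoU still varies.
print(iou_and_giou((0, 0, 1, 1), (2, 0, 3, 1)))  # (0.0, -0.333...)

For non-overlapping boxes IoU is identically 0, which is the plateau the abstract refers to, while GIoU keeps decreasing as the boxes move apart; this is what makes 1 − GIoU usable as a regression loss.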

Cited by 3,292 publications (1,525 citation statements)
References 24 publications
“…The class which has more samples in a dataset or mini-batch during training in the context of class imbalance. [17] and in KL loss [54] for Smooth L1 Loss), while some methods such as GIoU Loss [55] directly predict the bounding box coordinates. For the sake of clarity, we use x to denote the regression loss input for any method.…”
Section: Over-represented Class (mentioning, confidence: 99%)
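
The statement above contrasts Smooth L1 applied to transformed box offsets with GIoU Loss applied directly to box coordinates. A hedged sketch of the two styles, assuming PyTorch and torchvision (the function names, tensor shapes, and box formats are illustrative, not taken from the cited paper):

import torch
import torch.nn.functional as F
from torchvision.ops import generalized_box_iou

def smooth_l1_regression_loss(pred_offsets, target_offsets):
    # Offsets: (N, 4) tensors of (tx, ty, tw, th) parameterizations.
    return F.smooth_l1_loss(pred_offsets, target_offsets)

def giou_regression_loss(pred_boxes, target_boxes):
    # Boxes: (N, 4) tensors of (x1, y1, x2, y2) coordinates.
    # generalized_box_iou returns the pairwise (N, N) GIoU matrix; the
    # diagonal holds the matched prediction/target pairs.
    giou = torch.diag(generalized_box_iou(pred_boxes, target_boxes))
    return (1.0 - giou).mean()  # per-box GIoU loss lies in [0, 2]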
“…For example, in AP Loss, smooth L1, which has the range [0, ∞) in the logarithmic domain (since the input to the loss is conventionally provided after applying a logarithmic transformation), is used for regression, while L_AP ∈ [0, 1]. Another example is the GIoU Loss [55], which is in the [−1, 1] range and is used together with the cross-entropy loss. The authors set the weighting factor of the GIoU Loss to 10, and this regularization is exploited to balance the range difference and ensure balanced training.…”
Section: Imbalance 4: Objective Imbalance (mentioning, confidence: 99%)
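
A minimal sketch of the objective-balancing point above, assuming PyTorch and torchvision: the GIoU-based regression term (derived from a quantity in [−1, 1]) is scaled by a weighting factor of 10 before being added to the classification cross entropy. The function name, tensor shapes, and single-head structure are illustrative assumptions, not the cited setup's code.

import torch
import torch.nn.functional as F
from torchvision.ops import generalized_box_iou

def detection_loss(cls_logits, cls_targets, pred_boxes, target_boxes,
                   giou_weight=10.0):
    # Classification term: cross entropy over per-box class logits.
    cls_loss = F.cross_entropy(cls_logits, cls_targets)
    # Regression term: 1 - GIoU on matched (prediction, target) box pairs.
    giou = torch.diag(generalized_box_iou(pred_boxes, target_boxes))
    reg_loss = (1.0 - giou).mean()
    # The weight balances the range difference between the two objectives.
    return cls_loss + giou_weight * reg_loss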
“…Some focus on optimizing the feature-extraction ability of backbone networks [35,56,46,48]. Some methods improve performance by modifying certain metrics, such as the loss function in RetinaNet [36] and the new IoU definition given in GIoU [57]. With new measuring metrics introduced in [26] and a comprehensive analysis performed after 2017, detection of objects at different scales became a major focus [48,58,59,37].…”
Section: Other Detector Architectures (mentioning, confidence: 99%)
“…Modifying evaluation metrics can result in a significant accuracy increase. Rezatofighi et al. [57] proposed Generalized Intersection over Union (GIoU) in 2019 to replace the standard IoU and used GIoU as the loss function. Before the study in [57], CNN detectors computed IoU at test time to evaluate results, while employing other metrics as the loss function during training.…”
Section: RetinaNet (mentioning, confidence: 99%)