2020
DOI: 10.1007/978-3-030-58548-8_10

Distribution-Balanced Loss for Multi-label Classification in Long-Tailed Datasets

Cited by 175 publications (133 citation statements: 0 supporting, 115 mentioning, 0 contrasting)
References: 28 publications

“…a low number of paintings), and then predict them to any given painting. Investigating different loss functions (such as focal loss [20] or distribution-balanced loss [40]) may help mitigate this issue.…”
Section: Discussion (mentioning; confidence: 99%)
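
The statement above points to focal loss [20] as one candidate for handling rare labels. As a minimal sketch of that idea, here is the per-label binary formulation commonly used for multi-label classification; the defaults gamma=2.0 and alpha=0.25 are the usual choices, not values taken from the citing paper:

```python
import torch
import torch.nn.functional as F

def binary_focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    """Per-label focal loss: down-weights easy examples so training
    concentrates on hard (often tail-class) labels."""
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)        # prob. assigned to the true value
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * ce).mean()  # (1 - p_t)^gamma is the focusing term
```
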
“…Besides, semi-supervised learning [30] and zero-shot learning [31] have been adopted to address the high cost of data labeling. The influence of the loss function [32], [33] on multi-label classification is also worth studying. In [32], label occurrence is calculated and introduced into the loss function to deal with the imbalance between positive and negative training samples.…”
Section: Related Work (mentioning; confidence: 99%)
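
Assuming [32] denotes the distribution-balanced loss of the paper this report covers, the occurrence-based weighting it describes can be sketched as the paper's re-balanced weighting: each label's class-level sampling probability is compared with the instance-level probability induced by label co-occurrence. The smoothing hyperparameters below are illustrative defaults, not values verified against the paper:

```python
import torch

def rebalanced_weights(labels, class_counts, alpha=0.1, beta=10.0, mu=0.2):
    """Re-balancing weight for each (instance, label) pair.

    labels:       (N, C) multi-hot float tensor
    class_counts: (C,)   float tensor of positives per class
    """
    C = labels.size(1)
    p_class = (1.0 / C) / class_counts                    # class-level sampling probability
    p_inst = (labels * p_class).sum(dim=1, keepdim=True)  # instance-level prob. from co-occurrence
    r = p_class / p_inst.clamp(min=1e-12)                 # raw re-balancing ratio, (N, C)
    return alpha + torch.sigmoid(beta * (r - mu))         # smoothed to a bounded range
```
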
“…The influence of the loss function [32], [33] on multi-label classification is also worth studying. In [32], label occurrence is calculated and introduced into the loss function to deal with the imbalance between positive and negative training samples. In [33], the binary cross-entropy function is improved by introducing a scalable neighbor discriminative loss that embeds a graph structure into the network.…”
Section: Related Work (mentioning; confidence: 99%)
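
The quoted statement does not spell out the formulation of [33]'s scalable neighbor discriminative loss. Purely as an illustration of the general pattern it describes (binary cross-entropy plus a graph term tying each sample to its neighbors), one could write something like the following; the adjacency matrix, its construction, and the weight `lam` are all assumptions, not [33]'s actual design:

```python
import torch
import torch.nn.functional as F

def bce_with_neighbor_term(logits, targets, embeddings, adjacency, lam=0.1):
    """BCE plus a graph-smoothness penalty, sum_ij A_ij * ||z_i - z_j||^2,
    so samples connected in the (assumed) neighbor graph receive similar
    embeddings. Illustrative sketch only."""
    bce = F.binary_cross_entropy_with_logits(logits, targets)
    sq_dists = torch.cdist(embeddings, embeddings, p=2) ** 2   # pairwise ||z_i - z_j||^2
    graph = (adjacency * sq_dists).sum() / adjacency.sum().clamp(min=1.0)
    return bce + lam * graph
```
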
“…There have been several studies that address the problem from the perspective of class imbalance. [20] alleviates the data imbalance of labels that appear relatively rarely in multi-label samples. It suggests a re-sampling method, such as [15], [16], to make the class distribution uniform.…”
Section: Related Work (mentioning; confidence: 99%)
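
Class-rebalancing re-sampling such as [15], [16] is commonly approximated in the multi-label case by drawing each sample in proportion to the inverse frequency of its rarest positive label. The sketch below shows that approximation (the exact schemes in [15], [16] may differ):

```python
import numpy as np
from torch.utils.data import WeightedRandomSampler

def make_rebalancing_sampler(labels):
    """labels: (N, C) binary numpy array of multi-hot annotations."""
    class_counts = labels.sum(axis=0)             # positives per class
    inv_freq = 1.0 / np.maximum(class_counts, 1)  # guard against empty classes
    # Each sample inherits the inverse frequency of its rarest positive label.
    sample_w = (labels * inv_freq).max(axis=1)
    sample_w = np.where(sample_w > 0, sample_w, inv_freq.min())  # all-negative rows
    return WeightedRandomSampler(sample_w.tolist(),
                                 num_samples=len(sample_w),
                                 replacement=True)
```

The returned sampler plugs into a standard PyTorch `DataLoader` via its `sampler` argument.
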
“…In other words, the tendency toward contextual bias due to co-occurrence becomes worse in multi-label data. To mitigate this co-occurrence bias in multi-label data, methods have been proposed that reduce the correlation between co-occurring instances [20] or the positive-negative imbalance [35] by modifying the loss function. [5] presents a method that uses the class activation map (CAM) to reduce the CAM similarity of different objects in co-occurring samples.…”
Section: Related Work (mentioning; confidence: 99%)
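
As background for the CAM-based method attributed to [5]: a class activation map projects the last convolutional feature maps through the final classifier weights, and penalizing the similarity between two classes' maps is one plausible reading of "reducing CAM similarity", not a verified reimplementation of [5]:

```python
import torch
import torch.nn.functional as F

def class_activation_maps(features, fc_weight):
    """features: (N, D, H, W) last-conv feature maps; fc_weight: (C, D)
    final linear classifier weights. Returns (N, C, H, W) CAMs."""
    return torch.einsum("cd,ndhw->nchw", fc_weight, features)

def cam_similarity(cams, class_i, class_j):
    """Batch-averaged cosine similarity between two classes' CAMs; a
    penalty on this value would push co-occurring objects' activations
    toward different spatial regions."""
    a = cams[:, class_i].flatten(1)   # (N, H*W)
    b = cams[:, class_j].flatten(1)
    return F.cosine_similarity(a, b, dim=1).mean()
```
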