2020
DOI: 10.48550/arxiv.2010.11820
Preprint
Posterior Re-calibration for Imbalanced Datasets

Abstract: Neural networks can perform poorly when the training label distribution is heavily imbalanced, as well as when the testing distribution differs from the training distribution. To deal with the shift in the testing label distribution that imbalance causes, we motivate the problem from the perspective of an optimal Bayes classifier and derive a post-training prior rebalancing technique that can be solved through a KL-divergence based optimization. This method allows a flexible post-training hyper-parameter to be …
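The prior-rebalancing idea in the abstract can be sketched as a small post-hoc correction: by Bayes' rule, p(y|x) ∝ p(x|y)p(y), so dividing out the training label prior and multiplying in an assumed test-time prior swaps the class prior without retraining. This is a minimal illustrative sketch, not the paper's exact method; the function name, the power-style hyper-parameter `tau`, and the choice of uniform test prior in the example are all assumptions.

```python
import numpy as np

def rebalance_posterior(probs, train_prior, test_prior, tau=1.0):
    """Re-weight softmax posteriors from a model trained under an
    imbalanced label prior so they reflect a different test-time prior.

    probs:       (N, C) predicted class probabilities
    train_prior: (C,) empirical label distribution of the training set
    test_prior:  (C,) assumed label distribution at test time
    tau:         post-training strength hyper-parameter (hypothetical
                 name; tau=0 leaves the posterior unchanged)
    """
    # Bayes rule: p(y|x) is proportional to p(x|y) p(y). Dividing out the
    # training prior and multiplying in the test prior swaps the prior.
    w = (np.asarray(test_prior) / np.asarray(train_prior)) ** tau
    adjusted = probs * w
    # Renormalize so each row is again a probability distribution.
    return adjusted / adjusted.sum(axis=1, keepdims=True)

# Example: a 90/10-imbalanced trainer that outputs p = [0.9, 0.1]
# becomes uniform after rebalancing toward a uniform test prior.
probs = np.array([[0.9, 0.1]])
out = rebalance_posterior(probs, [0.9, 0.1], [0.5, 0.5])
```

With `tau` exposed as a knob, the strength of the correction can be tuned after training, which matches the abstract's "flexible post-training hyper-parameter".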

Cited by 3 publications (2 citation statements)
References 12 publications (30 reference statements)
“…Jamal et al. [2020] improved the performance on rare examples by explicitly encouraging transfer learning. Re-calibration methods adjust the output logits with re-weighting [Tian et al., 2020, Menon et al., 2021].…”
Section: Supervised Learning With Dataset Imbalance
confidence: 99%
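The logit re-calibration the statement refers to can be illustrated with the additive adjustment popularized by logit-adjustment work (in the spirit of Menon et al., 2021): subtracting a prior-dependent offset so that frequent classes must win by a larger margin. This is a hedged sketch; the function name and the scaling factor `tau` are illustrative choices, not an interface from the cited papers.

```python
import numpy as np

def adjust_logits(logits, class_prior, tau=1.0):
    """Additive logit adjustment for class imbalance.

    logits:      (C,) or (N, C) raw model outputs
    class_prior: (C,) training label frequencies (must be positive)
    tau:         illustrative strength hyper-parameter
    """
    # Subtracting tau * log(prior) penalizes head classes: a large prior
    # yields a large log-prior, so its logit is reduced the most.
    return logits - tau * np.log(np.asarray(class_prior))

# Example: a tie between a 90%-frequency class and a 10%-frequency
# class is broken in favor of the rare class after adjustment.
adjusted = adjust_logits(np.array([2.0, 2.0]), np.array([0.9, 0.1]))
```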
“…The performance of vanilla supervised methods degrades significantly on class-imbalanced datasets [Cui et al., 2019, Cao et al., 2019, Buda et al., 2018], posing challenges to practical applications such as instance segmentation and depth estimation. Many recent works address this issue with various regularization and re-weighting/re-sampling techniques [Ando and Huang, 2017, Wang et al., 2017b, Jamal et al., 2020, Cui et al., 2019, Cao et al., 2019, Tian et al., 2020, Hong et al., 2021].…”
Section: Introduction
confidence: 99%