2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr42600.2020.00763
Rethinking Class-Balanced Methods for Long-Tailed Visual Recognition From a Domain Adaptation Perspective

Cited by 206 publications (176 citation statements); References 25 publications.
“…Denoting the two domains as D𝑢 and D𝑏, their distributions in feature space are defined as D𝑢 and D𝑏, respectively. Notably, we not only consider the long-tailed recognition problem as a domain adaptation task, analogous to [14], but also propose to define and slack the generalization error bound.…”
Section: Methodology 3.1 Modeling
confidence: 99%
“…The real-world data, however, typically follows long-tailed distributions [7,9]. To adapt models trained upon long-tailed distributions to uniform distributions, re-sampling and re-weighting methods [3,6] have been explored to re-balance the data distributions. Recently, feature-classifier decoupling learning strategies [14,16,23,33] have been proposed to learn representations and classifiers step by step. Despite their effectiveness, these approaches require complicated training strategies and/or work in empirical fashions, which hinders the interpretability and further progress of long-tailed recognition problems.…”
Section: Introduction
confidence: 99%
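The re-weighting line mentioned in the excerpt above can be illustrated with "effective number" class-balanced weights, one well-known re-weighting scheme for long-tailed data (the exact methods cited as [3,6] may differ). A minimal NumPy sketch, with all function names our own:

```python
import numpy as np

def class_balanced_weights(counts, beta=0.999):
    """Per-class weights from the 'effective number' of samples.

    Rare classes get larger weights; beta controls how quickly the
    effective number saturates as a class grows.
    """
    counts = np.asarray(counts, dtype=np.float64)
    effective_num = 1.0 - np.power(beta, counts)
    weights = (1.0 - beta) / effective_num
    # Normalize so the weights sum to the number of classes.
    return weights * len(counts) / weights.sum()

def weighted_cross_entropy(logits, label, weights):
    """Cross-entropy for one sample, scaled by its class weight."""
    z = logits - logits.max()                 # numerically stable softmax
    log_probs = z - np.log(np.exp(z).sum())
    return -weights[label] * log_probs[label]

# Long-tailed toy setup: head class 1000 samples, tail class 10.
w = class_balanced_weights([1000, 100, 10])
print(w)  # tail class receives the largest weight
```

Plugging these weights into the training loss up-weights tail-class errors without changing the sampling procedure.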
“…In dealing with the problem of dataset imbalance, we can also learn from other learning strategies. With meta-learning (domain adaptation), minor categories and major categories are processed differently to learn how to re-weight adaptively (Shu et al., 2019), or to formulate domain adaptation problems (Jamal et al., 2020). Metric learning essentially models the boundary/margin near minor categories, with the aim of learning better embeddings (Huang et al., 2016; Zhang et al., 2017).…”
Section: Related Work
confidence: 99%
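The metric-learning direction in the excerpt above typically relies on margin-based embedding objectives. A generic triplet margin loss, one common choice (not necessarily the exact losses used by Huang et al. or Zhang et al.), can be sketched as:

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """Hinge on the gap between positive and negative distances:
    pull same-class embeddings together, push other classes at least
    `margin` farther away than the positive."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# Satisfied triplet: negative is already far enough away -> zero loss.
print(triplet_margin_loss(np.array([0., 0.]),
                          np.array([0., 1.]),
                          np.array([3., 0.])))  # 0.0
```

For tail classes, such margins carve out explicit decision boundaries in embedding space even when only a handful of samples are available.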
“…Instead of hacking the prior distribution of classes, focusing on the hard samples also alleviates the long-tailed issue. Jamal et al. [174] adopt meta-learning to dynamically learn the weights for each instance by iteratively optimizing an inner loop. Li et al. [175] enhance the hard samples by group softmax.…”
Section: Hard Example Mining
confidence: 99%
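The per-instance meta-learned weights described in the last excerpt can be caricatured in closed form for logistic regression: at the first step of a learning-to-reweight inner loop, the derivative of the meta loss with respect to an example's weight reduces to the alignment between that example's gradient and the gradient on a small balanced meta set. The sketch below is an illustrative simplification under that assumption, not the method of Jamal et al. [174]; all names are ours.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def per_example_grads(w, X, y):
    """Per-example gradient of the logistic loss, shape (n, d)."""
    p = sigmoid(X @ w)
    return (p - y)[:, None] * X

def meta_reweight(w, X, y, X_meta, y_meta):
    """Score each training example by how well its gradient aligns with
    the balanced meta-set gradient; clip negatives and normalize."""
    g = per_example_grads(w, X, y)                       # (n, d)
    g_meta = per_example_grads(w, X_meta, y_meta).mean(axis=0)
    align = g @ g_meta                                   # alignment scores
    weights = np.maximum(align, 0.0)
    s = weights.sum()
    return weights / s if s > 0 else np.full(len(X), 1.0 / len(X))

# Imbalanced batch: three head-class (y=0) copies, one tail-class (y=1) example.
X = np.array([[1.0, 0.0]] * 4)
y = np.array([0.0, 0.0, 0.0, 1.0])
# Balanced meta set with one example per class.
X_meta = np.array([[1.0, 0.0], [-1.0, 0.0]])
y_meta = np.array([1.0, 0.0])
print(meta_reweight(np.zeros(2), X, y, X_meta, y_meta))  # tail example dominates
```

In a full inner loop these weights would be re-estimated every iteration as the model parameters move, which is the "iteratively optimizing an inner loop" the excerpt refers to.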