2020
DOI: 10.48550/arxiv.2009.13333
Preprint

Group Whitening: Balancing Learning Efficiency and Representational Capacity

Abstract: Batch normalization (BN) is an important technique commonly incorporated into deep learning models to perform standardization within mini-batches. The merits of BN in improving a model's learning efficiency can be further amplified by applying whitening, while its drawbacks in estimating population statistics for inference can be avoided through group normalization (GN). This paper proposes group whitening (GW), which elaborately exploits the advantages of the whitening operation and avoids the disadvantages of …
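As a rough sketch of the operation the abstract describes — splitting channels into groups and whitening each group with per-sample statistics, so no mini-batch population statistics are needed at inference — the following NumPy snippet illustrates one plausible form. The group count, epsilon, and the choice of ZCA whitening here are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def group_whitening(x, num_groups=4, eps=1e-5):
    """Illustrative sketch of group whitening (GW) for NCHW inputs.

    Per sample, channels are split into groups; each group's (c, H*W)
    feature matrix is centered and ZCA-whitened using its own
    statistics. Hyperparameters are assumptions for illustration.
    """
    n, ch, h, w = x.shape
    assert ch % num_groups == 0, "channels must divide evenly into groups"
    c = ch // num_groups  # channels per group
    out = np.empty_like(x, dtype=np.float64)
    for i in range(n):
        for g in range(num_groups):
            # (c, H*W) feature matrix for this sample's g-th group
            xg = x[i, g * c:(g + 1) * c].reshape(c, -1).astype(np.float64)
            xg = xg - xg.mean(axis=1, keepdims=True)  # center per channel
            cov = xg @ xg.T / xg.shape[1] + eps * np.eye(c)
            # ZCA whitening matrix: Sigma^{-1/2} = U diag(lam^{-1/2}) U^T
            lam, u = np.linalg.eigh(cov)
            wm = u @ np.diag(lam ** -0.5) @ u.T
            out[i, g * c:(g + 1) * c] = (wm @ xg).reshape(c, h, w)
    return out
```

After this transform, each group's per-sample covariance is (approximately) the identity, which is the "whitening" step that standardization in BN only approximates diagonally.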

Cited by 1 publication (1 citation statement)
References 36 publications
“…Whitening has long been used as a preprocessing method [18]. In recent years, whitening has found many applications in neural networks, including normalization [14,15,33], generative adversarial networks [40], and self-supervised learning [9]. In this work, we are the first to explore its application to long-tailed recognition.…”
Section: Related Work
confidence: 99%