2021
DOI: 10.1109/tpami.2019.2932062
Switchable Normalization for Learning-to-Normalize Deep Representation

Abstract: We address a learning-to-normalize problem by proposing Switchable Normalization (SN), which learns to select different normalizers for different normalization layers of a deep neural network. SN employs three distinct scopes to compute statistics (means and variances) including a channel, a layer, and a minibatch. SN switches between them by learning their importance weights in an end-to-end manner. It has several good properties. First, it adapts to various network architectures and tasks (see Fig. 1). Secon…
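For illustration only, here is a minimal PyTorch-style sketch of the switching mechanism the abstract describes. The module name, the NCHW tensor layout, and the single shared epsilon are assumptions of this sketch, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchableNorm2d(nn.Module):
    """Sketch of Switchable Normalization for NCHW feature maps."""
    def __init__(self, num_channels, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(1, num_channels, 1, 1))   # gamma
        self.bias = nn.Parameter(torch.zeros(1, num_channels, 1, 1))    # beta
        # One importance logit per normalizer (IN, LN, BN), for means and for variances.
        self.mean_logits = nn.Parameter(torch.zeros(3))
        self.var_logits = nn.Parameter(torch.zeros(3))

    def forward(self, x):
        # Channel-wise (IN), layer-wise (LN), and minibatch-wise (BN) statistics.
        mu_in, var_in = x.mean((2, 3), keepdim=True), x.var((2, 3), keepdim=True)
        mu_ln, var_ln = x.mean((1, 2, 3), keepdim=True), x.var((1, 2, 3), keepdim=True)
        mu_bn, var_bn = x.mean((0, 2, 3), keepdim=True), x.var((0, 2, 3), keepdim=True)

        # Importance weights are learned end-to-end and normalized with a softmax.
        w_mu = F.softmax(self.mean_logits, dim=0)
        w_var = F.softmax(self.var_logits, dim=0)

        mean = w_mu[0] * mu_in + w_mu[1] * mu_ln + w_mu[2] * mu_bn
        var = w_var[0] * var_in + w_var[1] * var_ln + w_var[2] * var_bn

        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return self.weight * x_hat + self.bias
```

Because the importance logits are ordinary parameters, the choice of normalizer is trained jointly with the rest of the network by backpropagation.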

Cited by 68 publications (44 citation statements). References 41 publications.
“…(6). The optimization of binary variables is well established in the literature [22,18,17,26], and these techniques can also be used to train DGConv. The gate parameters are optimized with the Straight-Through Estimator, similar to recent network quantization approaches, which is guaranteed to converge [5].…”
Section: Construction of the Relationship Matrix (mentioning)
confidence: 99%
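A rough sketch of the straight-through estimator mentioned in the excerpt above; the binary gate, its threshold at zero, and the identity backward pass are generic illustration choices, not the DGConv training code from the cited work.

```python
import torch

class BinaryGateSTE(torch.autograd.Function):
    """Binarize a real-valued gate in the forward pass, but let gradients
    pass through unchanged (the straight-through estimator)."""
    @staticmethod
    def forward(ctx, logits):
        return (logits > 0).float()          # hard 0/1 gate

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output                   # identity gradient w.r.t. the logits

# Usage: the gates stay binary in the forward computation, yet the underlying
# real-valued logits still receive gradients and can be trained end-to-end.
logits = torch.randn(4, requires_grad=True)
gates = BinaryGateSTE.apply(logits)
loss = gates.sum()
loss.backward()
print(logits.grad)  # all ones, passed straight through the binarization
```

The trick is simply to treat the non-differentiable binarization as the identity during backpropagation, which is why it pairs naturally with binary gate variables.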
“…In this case, BN is prone to introduce random noise during the training process. To solve this problem, we first introduce Switchable Normalization (SN) [18] to person search, which is proposed to learn different normalization operations in different normalization layers of a deep neural network. The candidate normalizers of SN include Instance Normalization (IN) [23], Layer Normalization (LN) [24] and Batch Normalization (BN) [22], which estimate channel-wise, layer-wise and minibatch-wise statistics, respectively.…”
Section: Feature Extractor (mentioning)
confidence: 99%
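To make the three statistic scopes in the excerpt concrete, the following small snippet shows the usual reduction axes for an NCHW tensor; the axis conventions are the standard IN/LN/BN ones and the snippet is an illustration, not code from the cited paper.

```python
import torch

x = torch.randn(8, 64, 32, 32)  # (N, C, H, W) mini-batch of feature maps

# Instance Norm: channel-wise statistics, one mean per sample and channel.
mu_in = x.mean(dim=(2, 3))      # shape (N, C)

# Layer Norm: layer-wise statistics, one mean per sample.
mu_ln = x.mean(dim=(1, 2, 3))   # shape (N,)

# Batch Norm: minibatch-wise statistics, one mean per channel.
mu_bn = x.mean(dim=(0, 2, 3))   # shape (C,)
```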
“…ω_k (k ∈ {in, ln, bn}) are importance weights learned by SN during training. For more details, refer to [18].…”
Section: Feature Extractor (mentioning)
confidence: 99%
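Written out, the weighted combination that these importance weights enter can be sketched as below; the notation (ω_k for mean weights, ω'_k for variance weights, a softmax over logits λ_k) follows the usual SN formulation, but this transcription should be read as a sketch rather than an exact reproduction of [18].

```latex
\hat{h}_{ncij} = \gamma \,
\frac{h_{ncij} - \sum_{k \in \Omega} \omega_k \mu_k}
     {\sqrt{\sum_{k \in \Omega} \omega'_k \sigma_k^2 + \epsilon}} + \beta,
\qquad
\omega_k = \frac{e^{\lambda_k}}{\sum_{z \in \Omega} e^{\lambda_z}},
\quad \Omega = \{\mathrm{in}, \mathrm{ln}, \mathrm{bn}\}
```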
“…Therefore, we combined the up‐sampling layer and the convolutional block to realise up‐sampling. Each convolutional block uses the same combination of a convolutional layer and switchable normalisation [27], followed by a leaky rectified linear unit activation layer. SN is an improved combination of batch normalisation, instance normalisation and layer normalisation, and it can learn to select different normalisers for different normalisation layers of a deep neural network.…”
Section: Underwater Image Colour Transfer GAN (mentioning)
confidence: 99%
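A plausible sketch of the convolutional block described in the excerpt: an up-sampling layer followed by convolution, switchable normalisation and a LeakyReLU activation. The kernel size, padding, up-sampling mode and negative slope are assumptions, and SwitchableNorm2d refers to the sketch given after the abstract above.

```python
import torch.nn as nn

def up_conv_block(in_channels, out_channels, scale=2, negative_slope=0.2):
    """Up-sampling followed by a conv + switchable-norm + LeakyReLU block."""
    return nn.Sequential(
        nn.Upsample(scale_factor=scale, mode='nearest'),
        nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
        SwitchableNorm2d(out_channels),   # SN sketch defined after the abstract above
        nn.LeakyReLU(negative_slope, inplace=True),
    )
```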