2019
DOI: 10.48550/arxiv.1903.03793
Preprint

SSN: Learning Sparse Switchable Normalization via SparsestMax

Cited by 6 publications (4 citation statements)
References 14 publications
“…Normalization techniques in deep neural networks were originally designed to regularize trained models and improve their generalization performance. Various normalization techniques [1,3,7,13,15,18,20,21,23,25] have been studied actively in recent years. The most popular is batch normalization (BN) [7], which normalizes activations over individual channels using the data in a mini-batch, while instance normalization (IN) [23] performs the same operation per instance instead of per mini-batch.…”
Section: Normalization in Neural Networks
confidence: 99%
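The BN-versus-IN distinction in the excerpt above comes down to which axes the statistics are pooled over. A minimal NumPy sketch of that difference (our illustration, not the cited papers' code; NCHW layout assumed, learnable scale/shift omitted):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # BN: one mean/var per channel, pooled over batch and spatial axes (N, H, W).
    mu = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-5):
    # IN: one mean/var per sample and channel, pooled over spatial axes (H, W) only.
    mu = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.randn(8, 16, 32, 32)  # (N, C, H, W)
print(batch_norm(x).shape, instance_norm(x).shape)
```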
“…Recently, batch-instance normalization (BIN) [18], switchable normalization (SN) [13], and sparse switchable normalization (SSN) [21] employ combinations of multiple normalization types to maximize the benefit. Note that BIN combines batch and instance normalization, while SN additionally uses LN.…”
Section: Normalization in Neural Networks
confidence: 99%
“…To combine the advantages of multiple normalization techniques, Switchable Normalization (SN) (Luo et al., 2018) and Sparse Switchable Normalization (SSN) (Shao et al., 2019) make use of differentiable learning to switch among different normalization methods.…”
Section: Related Work
confidence: 99%
“…For example, SN [25] computes BN, IN, and LN at the same time and uses AutoML [23] to determine how to combine them. SSN [39] uses SparsestMax to obtain a sparse SN. DN [27] proposes a more flexible form for representing normalizations and finds better ones.…”
Section: Related Work
confidence: 99%
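SparsestMax itself is the paper's constrained simplex projection, which drives the importance ratios toward a completely sparse (one-hot) solution over the course of training; reproducing it exactly is beyond a short sketch. To illustrate how a sparse distribution over normalizers can arise at all, here is the closely related sparsemax projection (Martins & Astudillo, 2016), which, unlike softmax, can assign exactly zero weight. This is sparsemax, not the SparsestMax of SSN:

```python
import numpy as np

def sparsemax(z):
    # Euclidean projection of z onto the probability simplex (Martins & Astudillo, 2016).
    z_sorted = np.sort(z)[::-1]                 # sort logits in descending order
    k = np.arange(1, len(z) + 1)
    cssv = np.cumsum(z_sorted)
    support = 1 + k * z_sorted > cssv           # coordinates that stay positive
    rho = k[support][-1]                        # size of the support set
    tau = (cssv[rho - 1] - 1) / rho             # threshold so the output sums to 1
    return np.maximum(z - tau, 0.0)

# Unlike softmax, sparsemax can zero out normalizers entirely.
print(sparsemax(np.array([2.0, 1.0, -1.0])))   # -> [1. 0. 0.]
```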