2020
DOI: 10.48550/arxiv.2006.14085
Preprint

Topological Insights into Sparse Neural Networks

Abstract: Sparse neural networks are effective approaches to reduce the resource requirements for deploying deep neural networks. Recently, the concept of adaptive sparse connectivity has emerged, allowing sparse neural networks to be trained from scratch by optimizing the sparse structure during training. However, comparing different sparse topologies and determining how sparse topologies evolve during training, especially when sparse structure optimization is involved, remain challenging…
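The adaptive sparse connectivity the abstract refers to can be illustrated with a minimal prune-and-regrow step in the spirit of Sparse Evolutionary Training (SET): at intervals during training, the smallest-magnitude active weights are dropped and an equal number of new connections are activated elsewhere, so the topology evolves while the sparsity level stays fixed. The `zeta` fraction, the random regrowth rule, and the re-initialization scale below are illustrative assumptions, not parameters taken from the paper above.

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    # One adaptive-sparse-connectivity step (SET-style sketch):
    # drop the fraction `zeta` of active weights with the smallest
    # magnitude, then regrow the same number of connections at
    # random currently-inactive positions.
    rng = np.random.default_rng() if rng is None else rng
    flat_w, flat_m = weights.ravel(), mask.ravel()  # views, edited in place
    active = np.flatnonzero(flat_m)
    n_drop = int(zeta * active.size)
    if n_drop == 0:
        return weights, mask

    # Prune: deactivate the smallest-magnitude active weights.
    drop = active[np.argsort(np.abs(flat_w[active]))[:n_drop]]
    flat_m[drop] = False
    flat_w[drop] = 0.0

    # Regrow: activate an equal number of inactive positions, so the
    # overall sparsity level stays constant across the step.
    inactive = np.flatnonzero(~flat_m)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    flat_m[grow] = True
    flat_w[grow] = rng.normal(0.0, 0.01, size=n_drop)  # small re-init
    return weights, mask
```

Repeating this step during training produces the succession of distinct sparse topologies whose comparison and evolution the paper studies.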

Cited by 3 publications (1 citation statement)
References 23 publications (39 reference statements)
“…Works from [5,32] also show that the sparse training achieves better performance than iteratively pruning a pre-trained dense model and static sparse neural networks. Moreover, Liu et al [23] demonstrated that there is a plenitude of sparse sub-networks with very different topologies that achieve the same performance.…”
Section: SpaceNet Approach for Continual Learning
Citation type: mentioning (confidence: 99%)