2020 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas45731.2020.9181239

Learning Low-Rank Structured Sparsity in Recurrent Neural Networks

Cited by 644 publications (1,138 citation statements)
References 9 publications

“…For the former, the fluctuation of the pruning ratio over a wide range of hyperparameter values when pruning LeNet models on the MNIST dataset is investigated, and a parameter-selection principle is then summarised. For the latter, the proposed method is compared with state-of-the-art algorithms on the CIFAR100 dataset, such as Liu's [24], ThiNet [15], SSL [23], SSR [31] and GM [20].…”
Section: Methods
confidence: 99%
“…In the real-valued case, a classical choice for the regulariser r(·) in (6) is the ℓ2 norm. Whenever sparsity is desired, it can be replaced with the ℓ1 norm, or with a proper group version acting at the neuron level [16, 17]. In most implementations of QVNNs, these regularisers are applied element-wise on the four components of each quaternion weight.…”
Section: Targeted Regularisation for QVNNs
confidence: 99%
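To make the group variant mentioned in this excerpt concrete, here is a minimal sketch of a neuron-level group-lasso regulariser in PyTorch, with the element-wise convention applied to the four quaternion components. It is illustrative only, not the API of the cited works; the function names (neuron_group_penalty, qvnn_penalty) and the four-component weight layout are assumptions.

```python
import torch

def neuron_group_penalty(weight: torch.Tensor) -> torch.Tensor:
    # One group per output neuron (one row of the weight matrix):
    # sum the l2 norms of the rows, i.e. an l1 norm across groups,
    # so entire rows (neurons) can be driven to exactly zero.
    return weight.norm(p=2, dim=1).sum()

def qvnn_penalty(r: torch.Tensor, i: torch.Tensor,
                 j: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
    # Element-wise convention from the excerpt: the same penalty is
    # applied separately to each of the four quaternion components.
    return sum(neuron_group_penalty(c) for c in (r, i, j, k))
```

In training, the penalty would simply be added to the task loss, e.g. loss = task_loss + lam * qvnn_penalty(r, i, j, k), where lam is a hypothetical strength hyperparameter.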
“…In contrast, structured pruning reduces the number of kernels or filters in CNNs [12]-[14] by using group-sparse regularisation such as group lasso [15], [16], which is widely used to prune unnecessary kernels or filters while keeping high performance.…”
Section: arXiv:2011.02389v1 [cs.CV] 4 Nov 2020
confidence: 99%
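The filter-level group lasso described in this excerpt (in the spirit of SSL [23]) can be sketched in PyTorch as below. This is a sketch under the assumption that filters are pruned by their group norm after training; the names filter_group_lasso and total_group_penalty are illustrative, not from the cited papers.

```python
import torch
import torch.nn as nn

def filter_group_lasso(conv: nn.Conv2d) -> torch.Tensor:
    # One group per output filter: flatten each filter to a vector,
    # take its l2 norm, and sum over filters (an l1 norm across
    # groups), so whole filters shrink toward zero and become prunable.
    w = conv.weight  # shape: (out_channels, in_channels, kH, kW)
    return w.flatten(start_dim=1).norm(p=2, dim=1).sum()

def total_group_penalty(model: nn.Module) -> torch.Tensor:
    # Sum the filter-level penalty over every convolutional layer.
    return sum(filter_group_lasso(m) for m in model.modules()
               if isinstance(m, nn.Conv2d))
```

Training with loss = criterion(model(x), y) + lam * total_group_penalty(model) pushes the group norms of unnecessary filters toward zero; those filters can then be removed while keeping high performance, which is the structured-pruning effect the excerpt describes.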