2020 IEEE International Conference on Information Technology, Big Data and Artificial Intelligence (ICIBA)
DOI: 10.1109/iciba50161.2020.9277078
A Parasitic Mechanism-Based Filter Pruning Method for Deep Convolutional Neural Networks

Cited by 2 publications (2 citation statements)
References 9 publications
“…This has significantly reduced the overall computational complexity of the network. Xu Han et al. [8] proposed a Parasitic-Mechanism (PAM)-based filter pruning method. In the first step, a parasitic layer is constructed that learns which filters are unimportant during the training process, so they can be pruned.…”
Section: Network Compression Using Pruning Methods
Confidence: 99%
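The idea described above, a learned importance score attached to each filter, can be illustrated with a minimal sketch. This is not the paper's actual parasitic-layer implementation; the gate values, threshold, and shapes below are assumptions for illustration, and in the real method the gates would be learned jointly with the network during training.

```python
import numpy as np

# Hypothetical sketch of gate-based filter pruning: each conv filter gets an
# importance score (here fixed; assumed to be learned during training).
# Filters whose gate falls below a threshold are treated as unimportant.
rng = np.random.default_rng(0)
filters = rng.normal(size=(8, 3, 3, 3))  # 8 filters, 3 input channels, 3x3 kernels
gates = np.array([0.9, 0.1, 0.7, 0.05, 0.8, 0.3, 0.95, 0.02])  # assumed importances

threshold = 0.25
keep = gates >= threshold        # boolean mask over filters
pruned_filters = filters[keep]   # surviving filters only

print(keep.sum())                # → 5 filters kept
print(pruned_filters.shape)      # → (5, 3, 3, 3)
```

Removing whole filters (rather than individual weights) shrinks the layer's output channels, so the next layer's input shrinks too, which is why filter pruning directly reduces computational complexity.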
“…Additionally, pruning can reduce the number of weights by up to 90%, as described by Han et al. [8]. Despite the success of these two methods, there is still a need to design an efficient compression model that can reduce memory consumption while preserving the system's performance [9].…”
Section: Introduction
Confidence: 99%
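The 90% figure quoted above refers to weight-level pruning. A common way to realize it, shown here as an illustrative sketch rather than the cited paper's exact procedure, is magnitude-based pruning: zero out the smallest-magnitude weights until the target sparsity is reached. The tensor size and sparsity level below are assumptions.

```python
import numpy as np

# Illustrative magnitude-based weight pruning: remove the 90% of weights
# with the smallest absolute value, leaving a sparse tensor.
rng = np.random.default_rng(42)
weights = rng.normal(size=1000)  # stand-in for one layer's weights

sparsity = 0.9                                    # target: prune 90% of weights
cutoff = np.quantile(np.abs(weights), sparsity)   # magnitude threshold
pruned = np.where(np.abs(weights) < cutoff, 0.0, weights)

print(np.mean(pruned == 0.0))  # fraction zeroed, ≈ 0.9
```

In practice the network is usually fine-tuned after pruning to recover accuracy, which is how high sparsity levels can be reached with little performance loss.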