Pruning by explaining: A novel criterion for deep neural network pruning
Pattern Recognition, 2021. DOI: 10.1016/j.patcog.2021.107899

Cited by 149 publications (87 citation statements). References 22 publications.
“…An alternative to backpropagation of gradients is Layer-wise Relevance Propagation (LRP) [49]. LRP has successfully been used as a saliency metric for pruning [50]. Similar to gradient backpropagation and NISP, LRP propagates information recursively from the last (output) layer of the network to its first (input) layer.…”
Section: Average of Gradient [47]
confidence: 99%
“…• LRP (Layer-wise Relevance Propagation) [233]. The LRP importance score of each channel i is calculated as ψ_i = (1/N) ∑ ∑ LRP(i, :, :), where N denotes the size of the data set; LRP computes the summed relevance of each channel in the network with respect to the overall classification score, decomposing a classification decision into per-channel contributions.…”
Section: Different Filter Selection Criteria
confidence: 99%
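The channel score quoted above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes the LRP backward pass has already been run and that we are given one relevance map per input sample, shaped (channels, height, width). The function name `channel_importance` and the random test data are hypothetical.

```python
import numpy as np

def channel_importance(relevance_maps):
    """Compute psi_i = (1/N) * sum_n sum_{x,y} LRP_n(i, x, y).

    relevance_maps: list of N arrays, each of shape (C, H, W),
    assumed to come from an LRP backward pass (not shown here).
    """
    N = len(relevance_maps)
    # Sum relevance over the spatial dimensions for each sample -> (N, C)
    per_sample = np.stack([r.sum(axis=(1, 2)) for r in relevance_maps])
    # Average over the N data samples -> (C,)
    return per_sample.sum(axis=0) / N

# Usage sketch: rank channels by relevance and prune the least relevant first.
rng = np.random.default_rng(0)
maps = [rng.random((8, 4, 4)) for _ in range(16)]  # toy relevance maps
psi = channel_importance(maps)
prune_order = np.argsort(psi)  # lowest-relevance channels first
```

Ranking by ψ and removing the lowest-scoring channels is the generic filter-selection pattern the quoted survey describes; the specific LRP propagation rules used to produce the relevance maps are given in the cited paper.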
“…The most relevant work is a CNN pruning method inspired by neural network interpretability. Yeom et al. [233] combined the two disconnected research lines of interpretability and model compression by basing a pruning method on layer-wise relevance propagation (LRP) [104], where weights or filters are pruned based on their relevance score.…”
Section: Introduction
confidence: 99%