2019 International Conference on Control, Artificial Intelligence, Robotics & Optimization (ICCAIRO)
DOI: 10.1109/iccairo47923.2019.00032
Compressing Convolutional Neural Networks by L0 Regularization

Cited by 4 publications (2 citation statements)
References 14 publications
“…Score comparison can be performed locally [23], i.e., among the interconnections of the same DNN layer, or globally [10], i.e., among the interconnections of the whole DNN model. As a second criterion, a regularization term [24]- [26] can be imposed on the loss function during training to promote the sparsity of the structure.…”
Section: Pruning Techniques Classification
confidence: 99%
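The local-versus-global distinction described in the excerpt above can be sketched with simple magnitude scores. This is an illustrative NumPy sketch, not the cited paper's method; the layer shapes, scales, and 50% sparsity level are assumptions chosen to make the contrast visible:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "DNN" weights: two layers with deliberately different magnitude scales.
weights = {
    "layer1": rng.normal(0.0, 1.0, size=(8, 8)),
    "layer2": rng.normal(0.0, 0.1, size=(8, 8)),
}

def local_masks(weights, sparsity):
    """Local score comparison: prune the smallest |w| within each layer separately."""
    masks = {}
    for name, w in weights.items():
        thresh = np.quantile(np.abs(w), sparsity)
        masks[name] = np.abs(w) > thresh
    return masks

def global_masks(weights, sparsity):
    """Global score comparison: one |w| threshold shared across the whole model."""
    all_scores = np.concatenate([np.abs(w).ravel() for w in weights.values()])
    thresh = np.quantile(all_scores, sparsity)
    return {name: np.abs(w) > thresh for name, w in weights.items()}

loc = local_masks(weights, 0.5)
glo = global_masks(weights, 0.5)
# Locally, each layer keeps about half of its weights. Globally, the
# small-magnitude layer2 falls almost entirely below the shared threshold.
print({name: round(m.mean(), 2) for name, m in loc.items()})
print({name: round(m.mean(), 2) for name, m in glo.items()})
```

The design trade-off the excerpt alludes to is visible here: a global threshold can prune a low-magnitude layer almost completely, whereas local thresholds preserve a fixed fraction of every layer.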
“…Score comparison can be performed locally [20], i.e., among the interconnections of the same DNN layer, or globally [9], i.e., among the interconnections of the whole DNN model. As a second criterion, a regularization term [21], [22] can be imposed on the loss function during training to promote the sparsity of the structure.…”
Section: Pruning Techniques Classification
confidence: 99%
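The second criterion in the excerpts, a regularization term added to the training loss to promote sparsity, can be sketched as follows. Since the exact L0 penalty (a count of nonzero weights) is non-differentiable, this sketch uses an L1 penalty with a proximal soft-thresholding step as a common differentiable stand-in; the data, `lam`, `lr`, and step count are all illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.5, 1.0]          # only 3 of 10 features matter
y = x @ true_w + 0.01 * rng.normal(size=200)

def train_sparse(x, y, lam=0.05, lr=0.01, steps=2000):
    """Gradient descent on squared error, plus a proximal (soft-threshold)
    step for the L1 sparsity penalty, which drives unneeded weights to
    exactly zero rather than merely shrinking them."""
    w = np.zeros(x.shape[1])
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w -= lr * grad
        # Proximal operator of lam * ||w||_1: shrink toward zero, clip at zero.
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
    return w

w = train_sparse(x, y)
# Most weights land exactly at zero, i.e. those connections are pruned.
print(np.count_nonzero(np.abs(w) > 1e-8), "nonzero weights remain")
```

After training, the zeroed weights correspond to prunable interconnections, which is the sense in which a sparsity-promoting regularizer acts as a pruning criterion.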