2019
DOI: 10.48550/arxiv.1910.12232
Preprint

Neural Network Distiller: A Python Package For DNN Compression Research

Neta Zmora,
Guy Jacob,
Lev Zlotnik
et al.

Abstract: This paper presents the philosophy, design and feature-set of Neural Network Distiller, an open-source Python package for DNN compression research. Distiller is a library of DNN compression algorithms implementations, with tools, tutorials and sample applications for various learning tasks. Its target users are both engineers and researchers, and the rich content is complemented by a design-for-extensibility to facilitate new research. Distiller is open-source and is available on Github at https://github.com/N…
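The abstract describes Distiller as a library of DNN compression algorithm implementations. As a rough illustration of one widely used compression primitive the library covers, the sketch below performs unstructured magnitude pruning in plain PyTorch; the helper name and structure are our own and are not Distiller's API.

```python
# Illustrative sketch only: unstructured magnitude pruning in plain PyTorch.
# This is the kind of compression primitive a library like Distiller packages
# and schedules; the helper below is NOT Distiller's API.
import torch
import torch.nn as nn

def magnitude_prune_(module: nn.Module, sparsity: float) -> None:
    """Zero out the `sparsity` fraction of smallest-magnitude weights, in place."""
    with torch.no_grad():
        for name, param in module.named_parameters():
            if not name.endswith("weight") or param.dim() < 2:
                continue  # skip biases and 1-D parameters (e.g. BatchNorm scales)
            k = int(sparsity * param.numel())
            if k == 0:
                continue
            threshold = param.abs().flatten().kthvalue(k).values
            param.mul_(param.abs() > threshold)  # boolean mask promotes to float

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
magnitude_prune_(model, sparsity=0.5)  # ~50% of each weight tensor is now zero
```

In practice a pruning schedule would interleave such sparsification steps with fine-tuning to recover accuracy.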

Cited by 8 publications (10 citation statements) | References 14 publications
“…The diversity of compression mechanisms and their limited support in deep learning frameworks led to the development of specialized software libraries such as Distiller [45] and NCCF [21]. Distiller and NCCF gather multiple compression schemes and the corresponding training algorithms into a single framework, and make them easier to apply to new models.…”
Section: Compression Framework
confidence: 99%
“…size by quantizing model parameters and pruning redundant neurons, and many such methods are covered by Distiller [53], an open-source library for model compression.…”
Section: Framework
confidence: 99%
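The parameter quantization mentioned in this statement can be illustrated with a minimal, generic sketch of symmetric per-tensor 8-bit weight quantization; this is our own example of the general technique, not Distiller's quantizer or its API.

```python
# Generic sketch of symmetric, per-tensor int8 weight quantization.
# Not Distiller's quantizer; just the basic float -> int8 mapping idea.
import torch

def quantize_symmetric(w: torch.Tensor, num_bits: int = 8):
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = w.abs().max() / qmax            # per-tensor symmetric scale
    q = torch.clamp((w / scale).round(), -qmax - 1, qmax).to(torch.int8)
    return q, scale                         # dequantize as q.float() * scale

w = torch.randn(64, 128)
q, scale = quantize_symmetric(w)
w_hat = q.float() * scale                   # reconstruction used at inference
max_err = (w - w_hat).abs().max()           # bounded by roughly scale / 2
```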
“…However, such frameworks are usually neither well generalized nor well maintained enough to be built on. Besides, Distiller [53] supports only one method for knowledge distillation, and Catalyst [16] is a framework built on PyTorch with a focus on reproducibility of deep learning research. While these frameworks are well generalized to support various deep learning methods, they require users to hardcode (reimplement) critical modules such as models and datasets, even when implementations are publicly available in popular libraries, in order to design complex knowledge distillation experiments.…”
Section: Framework
confidence: 99%
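The knowledge distillation referred to in this statement trains a student network against a teacher's softened outputs. The sketch below is a generic version of the standard soft-target loss (temperature-scaled KL divergence mixed with cross-entropy); it is not Distiller's or Catalyst's implementation, and the temperature and weighting values are illustrative.

```python
# Generic knowledge-distillation loss: soften teacher/student logits with a
# temperature T, match them with KL divergence, and mix in the usual hard-label
# cross-entropy. Not tied to Distiller's or Catalyst's APIs.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.7):
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                                  # rescale to keep gradient magnitude
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
distillation_loss(student_logits, teacher_logits, targets).backward()
```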
“…For instance, [31] sequentially prunes and retrains on a per-layer basis, while works such as [37] have to add many auxiliary layers on top of the chosen architecture in order to create and train their early-exit classifiers. Distiller [41] and Mayo [39] are two state-of-the-art open-source frameworks that allow for experimentation with such pruning techniques. Mayo focuses on automating the search for hyperparameters related to pruning, while Distiller focuses on implementing a wide variety of the pruning techniques discussed above.…”
Section: Background and Related Work
confidence: 99%
“…Based on this information, the PDC automatically calculates all the dependent layers and communicates this information to the pruning stage, so that layers can be pruned in dependent groups if necessary. This automation makes ADaPT very easy to use, as tools such as Distiller [41] require the user to manually identify each dependent convolution in the entire network, which for larger networks can be very tedious to list.…”
Section: Pruning Dependency Calculation (PDC)
confidence: 99%
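The dependency problem described here arises because removing output filters from one convolution changes the expected input-channel count of every layer that consumes its output. The minimal PyTorch sketch below (our own illustration, not ADaPT's or Distiller's code) prunes two consecutive convolutions as a dependent group.

```python
# Why channel pruning must treat consecutive layers as a dependent group:
# dropping output filters of conv1 forces dropping the matching input channels
# of conv2. Illustrative only; not ADaPT or Distiller code.
import torch
import torch.nn as nn

conv1 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
conv2 = nn.Conv2d(32, 64, kernel_size=3, padding=1)

# Rank conv1's 32 output filters by L1 norm and keep the strongest 16.
keep = conv1.weight.abs().sum(dim=(1, 2, 3)).topk(16).indices

pruned1 = nn.Conv2d(16, 16, kernel_size=3, padding=1)
pruned2 = nn.Conv2d(16, 64, kernel_size=3, padding=1)
with torch.no_grad():
    pruned1.weight.copy_(conv1.weight[keep])      # drop output filters
    pruned1.bias.copy_(conv1.bias[keep])
    pruned2.weight.copy_(conv2.weight[:, keep])   # drop matching input channels
    pruned2.bias.copy_(conv2.bias)

x = torch.randn(1, 16, 8, 8)
assert pruned2(pruned1(x)).shape == (1, 64, 8, 8)  # the network still composes
```

Any BatchNorm layer between the two convolutions, or a residual connection joining their outputs, would tie further layers into the same group, which is the bookkeeping the quoted PDC automates.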