2020
DOI: 10.48550/arxiv.2009.09346
Preprint

TorchDyn: A Neural Differential Equations Library

Abstract: Continuous-depth learning has recently emerged as a novel perspective on deep learning, improving performance in tasks related to dynamical systems and density estimation. Core to these approaches is the neural differential equation, whose forward passes are the solutions of an initial value problem parametrized by a neural network. Unlocking the full potential of continuous-depth models requires a different set of software tools, due to peculiar differences compared to standard discrete neural networks, e.g., i…
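The forward-pass-as-IVP idea in the abstract can be sketched in a few lines. The snippet below is an illustrative fixed-step Euler integration of dz/dt = f_theta(z), not TorchDyn's actual API (the library wraps this pattern with adaptive solvers and autodiff-based training); the function name and parametrization are assumptions for illustration only.

```python
import math

def forward_pass(z0, theta, t0=0.0, t1=1.0, steps=100):
    """Toy continuous-depth forward pass: solve the IVP
    dz/dt = f_theta(z) with fixed-step Euler, where
    f_theta(z) = tanh(W z + b) plays the role of the
    neural network parametrizing the dynamics."""
    W, b = theta
    z = list(z0)
    h = (t1 - t0) / steps
    for _ in range(steps):
        # evaluate the parametrized vector field at the current state
        fz = [math.tanh(sum(Wij * zj for Wij, zj in zip(row, z)) + bi)
              for row, bi in zip(W, b)]
        # one explicit Euler step
        z = [zi + h * fzi for zi, fzi in zip(z, fz)]
    return z
```

The "depth" of the model is the integration interval [t0, t1]; a production library replaces the Euler loop with an adaptive solver and differentiates through it.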

Cited by 9 publications
(14 citation statements)
References 17 publications
(37 reference statements)
“…The density estimation tasks were conducted on flows equipped with Free-form Jacobian of Reversible Dynamics (FFJORD) (Grathwohl et al, 2019), using adaptive ODE solvers (Dormand and Prince, 1980). We used two code bases (FFJORD from Grathwohl et al (2019) and TorchDyn (Poli et al, 2020a)) over which we implemented our pruning framework.…”
Section: Methods
Mentioning confidence: 99%
“…All code and data, including the hyperparameter details for all experiments, are openly accessible online at: https://github.com/lucaslie/torchprune For the experiments on the toy datasets, we based our code on the TorchDyn library (Poli et al, 2020a). For the experiments on the tabular datasets and image experiments, we based our code on the official code repository of FFJORD (Grathwohl et al, 2019).…”
Section: S2 Reproducibility Matters
Mentioning confidence: 99%
“…For example, forward sensitivity methods benefit from a breakdown of matrix-Jacobian products into vmapped vector-Jacobian products. We have developed a PyTorch library designed for broader compatibility with the neural differential equation ecosystem, e.g. torchdiffeq (Chen et al, 2018) and torchdyn (Poli et al, 2020b). Here, we provide code for several key methods and classes.…”
Section: Additional Details on the Realization of MSLs
Mentioning confidence: 99%
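The breakdown mentioned in the quote — a matrix-Jacobian product computed as a batch of vector-Jacobian products — can be illustrated without autodiff. In the sketch below the Jacobian of a toy map is written analytically; the function names are illustrative, and the row-wise batching in `mjp_via_vjps` is what a vmapped VJP (e.g. `torch.func.vmap` over a VJP) performs without ever materializing the full Jacobian.

```python
import math

def f(x):
    # toy map R^3 -> R^3, used only for illustration
    return [x[0] * x[1], math.sin(x[2]), x[0] + x[2]]

def jacobian(x):
    # analytic Jacobian of f (rows = outputs, columns = inputs)
    return [
        [x[1], x[0], 0.0],
        [0.0, 0.0, math.cos(x[2])],
        [1.0, 0.0, 1.0],
    ]

def vjp(x, v):
    # one vector-Jacobian product v^T J; an autodiff framework
    # computes this via reverse mode without building J explicitly
    J = jacobian(x)
    return [sum(v[i] * J[i][j] for i in range(len(v)))
            for j in range(len(J[0]))]

def mjp_via_vjps(x, M):
    """Matrix-Jacobian product M @ J as a batch of row-wise VJPs.
    Batching this loop over M's rows is exactly what vmapping
    a VJP achieves."""
    return [vjp(x, row) for row in M]
```

With M set to the identity, the batched VJPs recover the full Jacobian row by row.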
“…Differential equations are the language of science and engineering. As methods (Jia and Benson, 2019) and software frameworks (Rackauckas et al, 2019; Poli et al, 2020b) are improved, yielding performance gains or speedups (Poli et al, 2020a; Kidger et al, 2020a; Pal et al, 2021), the range of applicability of neural differential equations is extended to more complex and larger-scale problems. As with other techniques designed to reduce overall training time, we expect a net positive environmental impact from the adoption of MSLs in the framework.…”
Section: C5 Broader Impact
Mentioning confidence: 99%
“…The simulations have been carried out on a machine equipped with an AMD Ryzen Threadripper 3960X CPU and two NVIDIA RTX 3090 graphics cards. The software has been implemented in Python using the torchdyn [35] and torchdiffeq [17] libraries. In all the experiments, the Adam optimizer [36] has been used to perform the gradient descent iterations and, unless specified otherwise, its learning rate has been set to 10⁻³.…”
Section: A Experimental Setup
Mentioning confidence: 99%
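The Adam update used in the setup above (with learning rate 10⁻³) can be written out explicitly. The scalar sketch below follows the standard Adam recurrence (Kingma & Ba, 2015); the function name and state layout are illustrative assumptions, not the PyTorch API.

```python
import math

def adam_step(theta, grad, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One scalar Adam update with the learning rate quoted above.
    `state` = (m, v, t): first/second moment estimates and step count."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad           # biased first-moment estimate
    v = b2 * v + (1 - b2) * grad * grad    # biased second-moment estimate
    m_hat = m / (1 - b1 ** t)              # bias correction
    v_hat = v / (1 - b2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, (m, v, t)
```

On the very first step the bias corrections cancel the moment decay exactly, so the parameter moves by approximately lr in the direction opposing the gradient's sign, regardless of the gradient's magnitude.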