2021
DOI: 10.48550/arxiv.2106.10820
Preprint
Stateful ODE-Nets using Basis Function Expansions

Abstract: The recently-introduced class of ordinary differential equation networks (ODE-Nets) establishes a fruitful connection between deep learning and dynamical systems. In this work, we reconsider formulations of the weights as continuous-depth functions using linear combinations of basis functions. This perspective allows us to compress the weights through a change of basis, without retraining, while maintaining near state-of-the-art performance. In turn, both inference time and the memory footprint are reduced, en…
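The core idea in the abstract — parameterizing weights as a linear combination of basis functions and compressing them by a change of basis, with no retraining — can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it uses Legendre polynomials as a stand-in basis and a least-squares projection as the compression step; the paper's actual basis choice and procedure may differ.

```python
import numpy as np

# Hypothetical sketch: a continuous-depth weight w(t) is parameterized
# as w(t) = sum_k c_k * P_k(t), with P_k Legendre polynomials on [-1, 1].
rng = np.random.default_rng(0)
K = 8                        # original number of basis coefficients
coeffs = rng.normal(size=K)  # stand-ins for trained coefficients

t = np.linspace(-1.0, 1.0, 200)
w = np.polynomial.legendre.legval(t, coeffs)  # evaluate w(t)

# "Compression by change of basis": project w(t) onto a smaller basis
# (K_small coefficients) via least squares -- no retraining involved.
K_small = 4
small_coeffs = np.polynomial.legendre.legfit(t, w, deg=K_small - 1)
w_compressed = np.polynomial.legendre.legval(t, small_coeffs)

err = np.max(np.abs(w - w_compressed))
print(f"compressed {K} -> {K_small} coefficients, max error {err:.3f}")
```

The compressed representation needs only `K_small` coefficients per weight function, which is the source of the reduced memory footprint and inference time the abstract describes.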

Cited by 1 publication (1 citation statement). References 36 publications.
“…The use of ODE-based learning architectures has also received considerable attention in recent years with examples such as continuous-time neural ODEs [9,34,33] and their recurrent extensions ODE-RNNs [35], as well as RNNs based on discretizations of ODEs [7,12,10,28,36,37].…”
Section: Related Work
confidence: 99%