2022
DOI: 10.48550/arxiv.2204.03418
Preprint

Continual Inference: A Library for Efficient Online Inference with Deep Neural Networks in PyTorch

Abstract: We present Continual Inference, a Python library for implementing Continual Inference Networks (CINs) in PyTorch, a class of Neural Networks designed specifically for efficient inference in both online and batch processing scenarios. We offer a comprehensive introduction and guide to CINs and their implementation in practice, and provide best practices and code examples for composing complex modules for modern Deep Learning. Continual Inference is readily downloadable via the Python Package Index and at www.gi…
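To illustrate the composition workflow the abstract describes, here is a minimal sketch of how such a network might be built and run in both batch and online mode. The names co.Sequential, co.Conv3d, forward, and forward_step follow the API presented in the paper, but the exact signatures and warm-up behavior shown here are assumptions, not a verified reference.

```python
# Minimal sketch of the usage pattern described in the paper,
# assuming the `continual` package API (co.Sequential, co.Conv3d,
# forward, forward_step). Signatures are assumptions.
import torch
import continual as co

# Compose continual modules like an ordinary nn.Sequential.
net = co.Sequential(
    co.Conv3d(in_channels=3, out_channels=8, kernel_size=(3, 3, 3)),
    co.Conv3d(in_channels=8, out_channels=8, kernel_size=(3, 3, 3)),
)

# Batch (clip) mode: same semantics as regular 3D convolutions.
clip = torch.randn(1, 3, 16, 32, 32)    # (batch, channels, time, H, W)
clip_out = net.forward(clip)

# Online mode: feed frames one at a time; internal state caches the
# overlapping computation, so each frame is processed only once.
for t in range(16):
    frame = torch.randn(1, 3, 32, 32)   # (batch, channels, H, W)
    step_out = net.forward_step(frame)  # may be empty until the
                                        # temporal receptive field fills
```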

Cited by 1 publication (1 citation statement)
References 16 publications (23 reference statements)
“…, f_{k+1}) are processed to get the result for time step k+1, and so on. Continual Inference Networks [127] can exploit redundancies in the computation of overlapping inputs in order to increase the overall efficiency. Continual 3-Dimensional Convolutional Neural Networks (Co3D CNNs) reuse pre-existing 3D CNN weights to reduce the FLOPs per prediction while retaining similar memory requirements and accuracy [128], and Continual Transformers reuse the redundancy in the self-attention operation of overlapping windows to greatly reduce the time and memory complexity per prediction [129].…”
Section: F. Alternatives for High-Resolution Deep Learning Methods (mentioning)
confidence: 99%
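To make the redundancy argument in this citation concrete, the sketch below shows the caching idea for a temporal convolution in plain PyTorch (all names are illustrative, not the Continual Inference API). A naive sliding window re-multiplies each frame K times as the window slides; the streaming version applies each kernel tap to a frame exactly once and accumulates partial sums per pending window, yet produces identical outputs.

```python
# Illustrative sketch (plain PyTorch, hypothetical names): caching
# per-frame partial sums removes the recomputation of overlapping
# windows in a temporal convolution.
import torch

C_IN, C_OUT, K = 8, 8, 4
W = torch.randn(C_OUT, C_IN, K)    # one kernel tap per time offset
b = torch.randn(C_OUT)

def continual_conv_stream(frames):
    """Yield conv outputs one step at a time, touching each frame once.

    Each incoming frame contributes W[:, :, m] @ frame to the K windows
    it participates in; partial sums are cached per pending window, so
    nothing is recomputed when the window slides.
    """
    pending = []                      # oldest accumulator first
    for f in frames:                  # f: (C_IN,)
        pending.append(b.clone())     # open window ending K-1 steps ahead
        L = len(pending)
        for p, acc in enumerate(pending):
            age = L - 1 - p           # frames this window has seen so far
            acc += W[:, :, age] @ f   # tap `age` applies to the new frame
        if L == K:                    # oldest window is now complete
            yield pending.pop(0)

frames = [torch.randn(C_IN) for _ in range(10)]

# Reference: ordinary sliding-window conv, which re-multiplies every
# frame K times as the window slides.
x = torch.stack(frames, dim=1).unsqueeze(0)   # (1, C_IN, T)
conv = torch.nn.Conv1d(C_IN, C_OUT, K)
conv.weight.data, conv.bias.data = W, b
reference = conv(x)[0]                        # (C_OUT, T - K + 1)

for t, y in enumerate(continual_conv_stream(frames)):
    assert torch.allclose(y, reference[:, t], atol=1e-5)
```

The same reuse principle, applied to 3D convolutions and to the self-attention of overlapping windows, is what yields the per-prediction FLOP and memory savings reported for Co3D CNNs [128] and Continual Transformers [129].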