2021
DOI: 10.48550/arxiv.2111.00254
Preprint

Equinox: neural networks in JAX via callable PyTrees and filtered transformations

Abstract: JAX and PyTorch are two popular Python autodifferentiation frameworks. JAX is based around pure functions and functional programming. PyTorch has popularised the use of an object-oriented (OO) class-based syntax for defining parameterised functions, such as neural networks. That this seems like a fundamental difference means current libraries for building parameterised functions in JAX have either rejected the OO approach entirely (Stax) or have introduced OO-to-functional transformations, multiple new abstract…
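The "callable PyTree" idea the abstract describes can be sketched in plain JAX: a class is registered as a PyTree so its parameters flow through `jax.grad` and `jax.jit` directly, while `__call__` gives it a PyTorch-style OO interface. This is a minimal illustration of the concept, not Equinox's actual implementation; the `Linear` class here is a hypothetical example.

```python
import jax
import jax.numpy as jnp

# A minimal "callable PyTree": the model object itself is a PyTree
# whose leaves are its parameters, and it is also a function of x.
@jax.tree_util.register_pytree_node_class
class Linear:
    def __init__(self, weight, bias):
        self.weight = weight
        self.bias = bias

    def __call__(self, x):
        return self.weight @ x + self.bias

    # PyTree protocol: flatten to (leaves, static aux data).
    def tree_flatten(self):
        return (self.weight, self.bias), None

    @classmethod
    def tree_unflatten(cls, aux, children):
        return cls(*children)

model = Linear(jnp.ones((2, 3)), jnp.zeros(2))

def loss(m, x):
    return jnp.sum(m(x) ** 2)

# Because the model is a PyTree, jax.grad differentiates with
# respect to the model object and returns another Linear holding
# the gradients of each parameter.
grads = jax.grad(loss)(model, jnp.ones(3))
```

Registering the class as a PyTree is what lets JAX's purely functional transformations accept an OO-style model without any wrapper layer.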

Cited by 3 publications (2 citation statements)
References 5 publications (6 reference statements)
“…(2) Operator overloading, which computes derivatives in real time using operators coded in classes. Some Python libraries that implement this technique are Auto_diff (Nobel, 2020), Autograd (Maclaurin et al, 2015), JAX (Kidger and Garcia, 2021), Chainer (Tokui et al, 2019) and PyTorch (Paszke et al, 2017).…”
Section: Methods
confidence: 99%
“…The code is written in Python and utilizes JAX [6] for performing automatic differentiation and numerical optimization for model fitting. All models were implemented from scratch, and we utilize equinox [17] for maintaining readability and elegance of the code. In our code base repository, we provide the code and scripts for reproducing all results in this paper.…”
Section: Supplementary Materials
confidence: 99%
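The workflow this citing paper describes — JAX autodifferentiation driving numerical optimisation for model fitting — can be sketched as follows. The least-squares line fit and synthetic data here are hypothetical stand-ins for illustration, not the paper's actual models:

```python
import jax
import jax.numpy as jnp

# Hypothetical model: a line y = a*x + b with params = (a, b).
def model(params, x):
    a, b = params
    return a * x + b

def loss(params, x, y):
    return jnp.mean((model(params, x) - y) ** 2)

# Synthetic data generated from a known slope and intercept.
x = jnp.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0

params = jnp.array([0.0, 0.0])
grad_fn = jax.jit(jax.grad(loss))  # compiled gradient of the loss

# Plain gradient descent; real code would typically use an
# optimiser library on top of JAX rather than a hand-rolled loop.
for _ in range(500):
    params = params - 0.5 * grad_fn(params, x, y)

# params approaches the true values (2.0, 1.0)
```

`jax.grad` transforms the pure loss function into its gradient, and `jax.jit` compiles it; this composition of functional transformations is the core pattern that libraries like equinox build their OO-style models on.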