2018
DOI: 10.21105/joss.00602

Flux: Elegant machine learning with Julia

Abstract: Flux is a library for machine learning (ML), written using the numerical computing language Julia (Bezanson et al. 2017). The package allows models to be written using Julia's simple mathematical syntax, and applies automatic differentiation (AD) to seamlessly calculate derivatives and train the model. Meanwhile, it makes heavy use of Julia's language and compiler features to carry out code analysis and make optimisations. For example, Julia's GPU compilation support (Besard, Foket, and De Sutter 2017) ca…
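The workflow the abstract describes — a model written in plain Julia syntax, with AD supplying the derivatives for training — can be sketched in a few lines. This is a minimal, hypothetical example (variable names and sizes are arbitrary, not from the paper), assuming a recent Flux release:

```julia
using Flux  # `gradient` is exported by Flux

# A model is ordinary Julia code: here, a linear map with bias,
# written in plain mathematical syntax.
W = randn(Float32, 2, 5)
b = zeros(Float32, 2)
predict(x) = W * x .+ b

# The loss is also just a Julia function.
loss(x, y) = sum(abs2, predict(x) .- y)

x, y = randn(Float32, 5), randn(Float32, 2)

# Flux's automatic differentiation computes the derivatives needed
# for training directly from the code above.
gW, gb = gradient((W, b) -> sum(abs2, W * x .+ b .- y), W, b)
```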

Cited by 274 publications (187 citation statements)
References 1 publication
“…Embeddings.jl exposes these as a standard matrix of numbers and a corresponding array of strings. This lets Julia programs use word embeddings easily, either on their own or alongside machine learning packages such as Flux (Innes, 2018). In such deep learning packages, it is common to use word embeddings as the input layer of an LSTM (long short-term memory) network or another machine learning model, where they may be kept invariant or used as initialization for fine-tuning on the supervised task.…”
Section: Discussion
confidence: 99%
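As a rough illustration of the pattern this excerpt describes, the sketch below feeds pretrained vectors from Embeddings.jl into a Flux recurrent layer. It assumes Embeddings.jl's `load_embeddings`/`GloVe` interface and Flux's `Recur`-style `LSTM`; the hidden size and example words are arbitrary:

```julia
using Embeddings, Flux

# Pretrained vectors come back as a d × V matrix plus a vocabulary
# of V strings (downloads the GloVe data on first use).
table = load_embeddings(GloVe{:en})
index = Dict(w => i for (i, w) in enumerate(table.vocab))

# Column lookup turns a word into its embedding vector.
embed(word) = table.embeddings[:, index[word]]

# Feed the (fixed) embeddings into an LSTM, one token at a time;
# the recurrent layer carries hidden state across calls.
d = size(table.embeddings, 1)
rnn = LSTM(d => 64)               # hidden size 64 is an arbitrary choice
states = [rnn(Float32.(embed(w))) for w in ["julia", "is", "fast"]]
```

Keeping `table.embeddings` outside the trained parameters corresponds to the "kept invariant" case; passing it through a trainable lookup layer instead would correspond to fine-tuning.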
“…Another major advantage of a single-language solution is the ability to automatically differentiate (AD) functions from their code representations. The Flux.jl package (Innes, 2018), for example, already makes use of AD to allow unparalleled flexibility in neural network design.…”
Section: Why Julia?
confidence: 99%
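To make the flexibility claim concrete: because Flux's AD (via its Zygote backend) differentiates functions from their code representation, it handles ordinary Julia control flow that a static graph would struggle to express. The function below is made up for illustration:

```julia
using Flux  # `gradient` comes from Flux's AD backend (Zygote)

# An arbitrary Julia function with input-dependent control flow.
function f(x)
    y = x
    while abs(y) > 1       # how many halvings run depends on x
        y /= 2
    end
    return y^2 + sin(x)
end

# AD works on the code directly: no graph-building API is needed.
df = gradient(f, 3.0)[1]   # ≈ 3/8 + cos(3) for this input
```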
“…The parameters of the DQN algorithm are summarized in Table I. It was implemented using the Flux.jl library [20].…”
Section: A Reinforcement Learning Training Procedure
confidence: 99%
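As a sketch of how such a DQN implementation might look in Flux, the snippet below defines a small Q-network and one update step. All sizes, hyperparameters, and the helper `dqn_step!` are hypothetical, not those of the cited paper; it assumes Flux's explicit-gradient API (`Flux.setup`/`Flux.update!`):

```julia
using Flux

n_states, n_actions = 4, 2            # hypothetical problem sizes

# Online Q-network and a frozen copy used as the target network.
qnet   = Chain(Dense(n_states => 32, relu), Dense(32 => n_actions))
target = deepcopy(qnet)
opt    = Flux.setup(Adam(1f-3), qnet)

γ = 0.99f0                            # discount factor

# One gradient step on a batch of transitions (s, a, r, s′, done).
function dqn_step!(s, a, r, s′, done)
    # TD targets from the frozen network; `done` masks terminal states.
    y = r .+ γ .* (1 .- done) .* vec(maximum(target(s′); dims = 1))
    grads = Flux.gradient(qnet) do m
        q  = m(s)                                         # n_actions × batch
        qa = vec(sum(q .* Flux.onehotbatch(a, 1:n_actions); dims = 1))
        Flux.mse(qa, y)                                   # squared TD error
    end
    Flux.update!(opt, qnet, grads[1])
end

# Hypothetical batch, just to show the shapes involved.
batch = 32
s, s′ = randn(Float32, n_states, batch), randn(Float32, n_states, batch)
a     = rand(1:n_actions, batch)
r     = randn(Float32, batch)
done  = zeros(Float32, batch)
dqn_step!(s, a, r, s′, done)
```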