2021 · Preprint
DOI: 10.48550/arxiv.2102.04877

Noisy Recurrent Neural Networks

Cited by 7 publications (11 citation statements). References 0 publications.

“…which is consistent with the standard convention used in the literature (e.g., [27,45]), in which the weight matrix acts on the output of the activation function, rather than appearing in its argument as in [48]. For example, to recover a multilayer perceptron with preactivations at layer $\ell$ defined as $h_i^\ell = W_{ij}^\ell\,\phi(h_j^{\ell-1}) + b_i^\ell$, we set $\gamma = 0$ and $A = B = [0]$, and view the discrete timesteps as subsequent layers $\ell \in [0, L-1]$.…”
Section: Self-averaging Random Network (supporting)
confidence: 80%
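As an illustrative sketch of the reduction described in this statement (the width, depth, activation, and weight initialization below are assumptions, not the cited papers' setup), iterating the quoted layer map over discrete timesteps reproduces a multilayer perceptron, one layer per timestep:

```python
import numpy as np

# Illustrative sketch: iterating the quoted layer map
#   h^l_i = W^l_ij * phi(h^{l-1}_j) + b^l_i
# over timesteps l = 0, ..., L-1 recovers a multilayer perceptron.
# Shapes, activation, and initialization are assumed, not from the papers.

rng = np.random.default_rng(0)
L, d = 4, 8                        # depth (layers/timesteps) and width
phi = np.tanh                      # activation; weights act on its *output*

h = rng.standard_normal(d)         # network input, playing the role of h^{-1}
for l in range(L):                 # discrete timesteps viewed as layers
    W = rng.standard_normal((d, d)) / np.sqrt(d)  # layer weights W^l
    b = np.zeros(d)                # layer bias b^l
    h = W @ phi(h) + b             # preactivation at layer l
print(h.shape)                     # -> (8,)
```

Note the ordering the quote emphasizes: the weight matrix multiplies `phi(h)`, rather than the activation being applied to `W @ h`.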
“…Our starting point is the continuous-time formulation of a recurrent neural network (RNN) as a stochastic differential equation (SDE) [48]: $\mathrm{d}h_t = f(h_t, x_t)\,\mathrm{d}t + g(h_t, x_t)\,\mathrm{d}B_t$,…”
Section: Constructing the Partition Function (mentioning)
confidence: 99%
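A minimal sketch of how such an SDE is simulated in practice, using Euler–Maruyama discretization; the drift $f$, diffusion $g$, step size, and input signal below are illustrative assumptions, not the exact construction of [48] or of the citing paper:

```python
import numpy as np

# Euler-Maruyama simulation of the quoted SDE
#   dh_t = f(h_t, x_t) dt + g(h_t, x_t) dB_t.
# Drift, diffusion, step size, and input are assumed forms for illustration.

rng = np.random.default_rng(1)
d, n_steps, dt = 8, 100, 0.01
A = rng.standard_normal((d, d)) / np.sqrt(d)

def f(h, x):
    """Drift: a simple recurrent update (assumed form)."""
    return np.tanh(A @ h + x)

def g(h, x):
    """Diffusion: constant additive noise (assumed form)."""
    return 0.1 * np.eye(d)

h = np.zeros(d)
for t in range(n_steps):
    x = np.sin(0.1 * t) * np.ones(d)            # toy input signal x_t
    dB = np.sqrt(dt) * rng.standard_normal(d)   # Brownian increment dB_t
    h = h + f(h, x) * dt + g(h, x) @ dB         # Euler-Maruyama step
print(h[:3])
```

Each step injects Gaussian noise scaled by $\sqrt{\mathrm{d}t}$ into the hidden state, which is what makes the discretized recurrence a "noisy" RNN.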
“…Our starting point is the continuous-time formulation of a recurrent neural network (RNN) as a stochastic differential equation (SDE) [51]…”
Section: Constructing the Partition Function (mentioning)
confidence: 99%
“…In principle, however, one could consider other forms of stochasticity; see [51] and references therein.…”
Section: A1 Simplifying Assumptions (mentioning)
confidence: 99%
“…More generally, the connection between deep learning and dynamical systems has recently motivated several physics-based models [2,3,17,46,47,57], as well as more stable [11,25,26] and more robust [1,28,44] models.…”
Section: Related Work (mentioning)
confidence: 99%