DOI: 10.31274/etd-180810-543
On delay stable communications in asynchronous networks

Cited by 1 publication (1 citation statement)
References 54 publications
“…It could seem counter-intuitive that recurrent neural network models do not have the best performance in this domain, given that they are specifically tailored to process sequential data. Recent literature has questioned the necessity of recurrent models in certain domains, including time series prediction where long-range dependencies are not needed [190,10], or because the recurrent models used are stable (no gradient problems during the optimisation) and can be approximated by feed-forward models [150]. Some feed-forward and convolutional architectures outperform recurrent ones in some examples of automatic translation [69] or speech synthesis [231].…”
Section: RNN ED
Citation type: mentioning
confidence: 99%