Published: 2020
DOI: 10.1016/j.neunet.2020.01.010
Performance boost of time-delay reservoir computing by non-resonant clock cycle

Abstract: The time-delay-based reservoir computing setup has seen tremendous success in both experiment and simulation. It allows for the construction of large neuromorphic computing systems with only a few components. However, until now the interplay of the different timescales has not been investigated thoroughly. In this manuscript, we investigate the effects of a mismatch between the time-delay and the clock cycle for a general model. Typically, these two time scales are considered to be equal. Here we show that the c…
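The setup the abstract describes can be illustrated with a short simulation. The sketch below is a minimal, illustrative implementation of a time-multiplexed delay reservoir (the nonlinearity, parameter names, and values are assumptions, not the paper's model): the input is held for one clock cycle T = N·θ, masked across N virtual nodes, and fed into a delayed feedback loop; setting `tau_offset` to a non-zero value realizes the delay/clock-cycle mismatch τ ≠ T that the paper studies.

```python
import numpy as np

def run_reservoir(u, n_nodes=50, tau_offset=0, eta=0.4, gamma=0.05, seed=0):
    """Drive a time-delay reservoir with a scalar input sequence u.

    Clock cycle: T = n_nodes * theta (theta is the virtual-node spacing).
    Delay: tau = (n_nodes + tau_offset) * theta, so tau_offset = 0 is the
    resonant case tau = T, and tau_offset != 0 is the mismatched case.
    All parameters are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    mask = rng.uniform(-1, 1, n_nodes)       # fixed random input mask
    delay_steps = n_nodes + tau_offset       # tau measured in units of theta
    x = np.zeros(delay_steps)                # delay-line history buffer
    states = np.zeros((len(u), n_nodes))
    for k, uk in enumerate(u):               # one clock cycle per input value
        for i in range(n_nodes):             # one virtual node per theta-step
            x_tau = x[0]                     # state exactly one delay tau ago
            drive = eta * x_tau + gamma * mask[i] * uk
            x_new = drive / (1 + drive**2)   # bounded saturating nonlinearity
            x = np.roll(x, -1)               # advance the delay line ...
            x[-1] = x_new                    # ... and append the new state
            states[k, i] = x_new
    return states

# Example: mismatched delay, tau = T + 3*theta
states = run_reservoir(np.sin(np.linspace(0, 10, 100)), tau_offset=3)
```

With `tau_offset = 0` each virtual node is driven only by its own past value; a non-zero offset shifts the feedback across nodes, which is the coupling-structure change at the heart of the paper's argument.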


Cited by 49 publications (53 citation statements)
References 23 publications
“…Depicted by the vertical red dashed lines are multiples of the input period time T, at which the total memory capacity MC again drops significantly, to around 40. A drop in the linear memory capacity was discussed in the paper by Stelzer et al [38] and explained by the fact that resonances between the delay time τ and the input period time T result in sparse connections between the virtual nodes.…”
Section: {N}
Citation type: mentioning
Confidence: 99%
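The memory capacity MC that this citation statement discusses can be computed with a standard linear-readout procedure. The sketch below is a generic illustration (ridge-regression readout; the function and parameter names are assumptions, not the cited paper's code): for each lag k, a linear readout is trained to reconstruct the input k steps in the past, and MC sums the squared correlations between reconstruction and target.

```python
import numpy as np

def memory_capacity(states, u, max_lag=20, reg=1e-6):
    """Linear memory capacity of a reservoir.

    states : (T, n) array of reservoir states over time
    u      : (T,) input sequence driving the reservoir
    Returns MC = sum_k corr(y_hat_k, u(t-k))^2 for k = 1..max_lag,
    with each readout fitted by ridge regression (strength `reg`).
    """
    mc = 0.0
    for k in range(1, max_lag + 1):
        X, y = states[k:], u[:-k]            # align states with u delayed by k
        # Ridge-regression readout: W = (X^T X + reg*I)^(-1) X^T y
        W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)
        y_hat = X @ W
        c = np.corrcoef(y_hat, y)[0, 1]      # reconstruction quality at lag k
        mc += c**2
    return mc
```

A drop of MC at resonant clock cycles, as described above, would show up as a dip in this sum when τ is tuned to a multiple of T.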
“…Hence, they can be employed in Machine-Learning applications with weight training. In our recent paper [50], we show that a single DDE can emulate a deep neural network and perform various computational tasks successfully. More specifically, the work [50] derives a multilayer neural network from a delay system with modulated feedback terms.…”
Section: Introduction
Citation type: mentioning
Confidence: 97%
“…This neural network is trained by gradient descent using backpropagation and applied to machine-learning tasks.…”
Section: Introduction
Citation type: mentioning
Confidence: 97%