2023
DOI: 10.1063/5.0143846

Effect of temporal resolution on the reproduction of chaotic dynamics via reservoir computing

Abstract: Reservoir computing is a machine learning paradigm that uses a structure called a reservoir, which has nonlinearities and short-term memory. In recent years, reservoir computing has expanded to new functions such as the autonomous generation of chaotic time series, as well as time series prediction and classification. Furthermore, novel possibilities have been demonstrated, such as inferring the existence of previously unseen attractors. Sampling, in contrast, has a strong influence on such functions. Sampling…

Cited by 5 publications (7 citation statements)
References 24 publications
“…Another approach to improving the reservoir performance by incorporating task-specific memory requirements is to modify the task itself. When constructing a time series model from data, the discretisation ∆t (sampling step) of the input time series can have a significant influence on the model and on the resulting performance [37,51]. In the context of reservoir computing, the input discretisation influences the memory required of the reservoir and is related to Takens (delay) embedding [34].…”
Section: Input Discretisation (mentioning; confidence 99%)
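The interaction between the sampling step and a Takens-style delay embedding described in this excerpt can be sketched in a few lines of NumPy. This is an illustrative sketch only; the embedding dimension and integer delay below are hypothetical values, not taken from the cited works.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay embedding of a scalar series sampled on a
    uniform grid: row k is [x[k], x[k+tau], ..., x[k+(dim-1)*tau]].
    With sampling step dt, the integer delay tau corresponds to a
    physical delay of tau * dt."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[j * tau : j * tau + n] for j in range(dim)])

# e.g. an embedding with dimension 3 and an integer delay of 2 samples
E = delay_embed(np.arange(10.0), dim=3, tau=2)
```

Coarsening the sampling step while keeping the integer delay fixed changes the physical delay tau * dt, which is one way the input discretisation interacts with the embedding and with the memory demanded of the reservoir.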
“…For our study, only the latter case is relevant as we only provide one of the three dynamical variables for the Lorenz 63 task (and not the complete history for the Mackey-Glass task). The sampling step for an optimal embedding for the Lorenz 63 system is about 0.1 [37,51]. Figure 3(c) shows the autocorrelation of the X variable (green line) and the cross-correlation between the X and Z variables (orange line) for the chaotic dynamics (figures 3(a) and (b)) with the parameters given above.…”
Section: Lorenz 63 Time-series Prediction (mentioning; confidence 99%)
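The autocorrelation analysis this excerpt refers to can be reproduced schematically. The integrator, initial condition, and burn-in below are illustrative assumptions (with the standard Lorenz 63 parameters), not the cited paper's exact setup.

```python
import numpy as np

def lorenz_x(n_steps, dt=0.01, n_burn=1000,
             sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """RK4-integrate the Lorenz 63 system and return the X component,
    discarding an initial transient of n_burn steps."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s = np.array([1.0, 1.0, 1.0])
    xs = np.empty(n_steps)
    for i in range(-n_burn, n_steps):
        k1 = f(s)
        k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2)
        k4 = f(s + dt * k3)
        s = s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        if i >= 0:
            xs[i] = s[0]
    return xs

def autocorr(x, max_lag):
    """Normalized autocorrelation for integer lags 0 .. max_lag - 1."""
    x = x - x.mean()
    c = np.array([np.dot(x[: len(x) - lag], x[lag:]) for lag in range(max_lag)])
    return c / c[0]

ac = autocorr(lorenz_x(8000), max_lag=100)  # lags up to 1.0 time units
```

With dt = 0.01, an integer lag of 10 corresponds to the delay of about 0.1 mentioned in the excerpt, so the decay of `ac` over the first few tens of lags is what informs the embedding choice.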
“…For a one-step-ahead x-to-x prediction task, the vector x(k) is the input and o(k) = x(k+1) is the target. It is noted that the sampling step has an impact both on the difficulty of the task and on its memory requirements [49,82]. Our focus is on the interrelation between the bifurcation structure and the performance, which is why only one task is used for the discussion (other tasks have been performed to ensure that the results are robust and are shown in the Supplementary material).…”
Section: Input Data and Prediction Task (mentioning; confidence 99%)
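As a concrete, deliberately minimal stand-in for the reservoir set-ups discussed in these excerpts, a small echo state network trained on exactly this one-step-ahead task might look like the following. All hyperparameters (reservoir size, spectral radius, ridge parameter, washout) are illustrative assumptions, not values from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_esn_one_step(u, n_res=200, spectral_radius=0.9, alpha=1e-6):
    """Drive a random tanh reservoir with input u(k) and fit a linear
    ridge-regularized readout to the targets o(k) = u(k+1)."""
    W_in = rng.uniform(-0.5, 0.5, n_res)
    W = rng.normal(size=(n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    X = np.zeros((len(u) - 1, n_res))
    x = np.zeros(n_res)
    for k in range(len(u) - 1):
        x = np.tanh(W @ x + W_in * u[k])
        X[k] = x
    y = u[1:]                        # one-step-ahead targets o(k) = u(k+1)
    washout = 100                    # discard the initial transient states
    Xw, yw = X[washout:], y[washout:]
    W_out = np.linalg.solve(Xw.T @ Xw + alpha * np.eye(n_res), Xw.T @ yw)
    return W_out, X, y

# sanity check on a smooth signal: the trained readout should fit well
u = np.sin(0.1 * np.arange(2000))
W_out, X, y = train_esn_one_step(u)
rmse = np.sqrt(np.mean((X[100:] @ W_out - y[100:]) ** 2))
```

Resampling `u` before training (e.g. taking every fifth point) changes both the difficulty of the one-step task and the memory required of the reservoir, which is the interaction the excerpt highlights.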
“…However, in a work on a power-consumption forecasting system, the root-mean-square error of predictions on day-level data is about 40% lower than on hour-level data [25]. Recently, Tsuchiyama et al. found that the prediction capacity for the Rössler system is about 4 times better at a sampling interval of 0.05 than at 0.01, which means that a smaller sampling interval does not always contribute to better prediction performance [26]. Inspired by their results, we aim to further investigate this counterintuitive phenomenon and explore the existence of the optimal sampling intervals, and the maximum one for training, in many other chaotic systems.…”
Section: Introduction (mentioning; confidence 97%)
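A small sketch of what comparing the two sampling intervals involves: assuming a trajectory integrated on a fine grid (the grid and the stand-in signal below are hypothetical, not the cited work's Rössler data), the intervals 0.01 and 0.05 correspond to integer strides on that grid.

```python
import numpy as np

def resample(x, dt_fine, dt_target):
    """Subsample a uniformly sampled series to a coarser interval that
    is an integer multiple of the fine one."""
    stride = round(dt_target / dt_fine)
    if not np.isclose(stride * dt_fine, dt_target):
        raise ValueError("dt_target must be an integer multiple of dt_fine")
    return x[::stride]

dt_fine = 0.01
x_fine = np.sin(0.01 * np.arange(5000))  # 5000 samples at dt = 0.01
x_001 = resample(x_fine, dt_fine, 0.01)  # stride 1: unchanged
x_005 = resample(x_fine, dt_fine, 0.05)  # stride 5: five-fold coarser
```

Training the same model on `x_001` and `x_005` and comparing prediction errors is the kind of experiment needed to probe the optimal and maximum sampling intervals the excerpt describes.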