2023
DOI: 10.34133/research.0174

Tipping Point Detection Using Reservoir Computing

Abstract: High-fidelity detection of tipping points, whose emergence is often induced by invisible changes in internal structures and/or external interferences, is of paramount benefit for understanding and predicting complex dynamical systems (CDSs). Detection approaches, which have been fruitfully developed from several perspectives (e.g., statistics, dynamics, and machine learning), have their own advantages but still encounter difficulties in the face of high-dimensional, fluctuating datasets. Here, using…

Cited by 8 publications (4 citation statements). References 44 publications.
“…Neural networks (NNs) equipped with inductive biases have remarkable abilities in learning and generalizing the intrinsic kinetics of the underlying systems from noisy data, such as Hamiltonian NNs [1,2], Lagrangian NNs [3], neural differential equations [4-6], and physics-informed NNs [7-15]. These frameworks have been applied successfully to many tasks (e.g., generative tasks [16], dynamics reconstruction [17-20], intelligent control problems [21,22], and tipping point detection [23,24]), sharing the common design idea of using an appropriate loss function that enforces the model to nearly obey the physical principles. Although outstanding progress has been achieved, these frameworks, which either enlarge the network complexity or overfit the noisy data during the training stage to decrease the loss, suffer from poor generalization ability.…”
mentioning
confidence: 99%
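The shared design idea described above (an extra loss term that penalizes departures from a physical law, added to the ordinary data-fit term) can be sketched minimally as follows. This is a hypothetical illustration in PyTorch for a toy ODE du/dt = -u; the network, collocation points, and weight lam are assumptions made for the example and are not the cited frameworks' implementations.

# Minimal sketch (assumption: toy ODE du/dt = -u, not the cited frameworks).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def physics_informed_loss(t_data, u_data, t_colloc, lam=1.0):
    # Data-fit term on noisy observations.
    data_loss = ((net(t_data) - u_data) ** 2).mean()

    # Physics term: penalize the ODE residual u'(t) + u(t) at collocation points.
    t = t_colloc.clone().requires_grad_(True)
    u = net(t)
    du_dt = torch.autograd.grad(u, t, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    physics_loss = ((du_dt + u) ** 2).mean()

    return data_loss + lam * physics_loss

# Usage: noisy samples of u(t) = exp(-t) plus random collocation points in [0, 2].
t_data = torch.linspace(0, 2, 20).unsqueeze(1)
u_data = torch.exp(-t_data) + 0.05 * torch.randn_like(t_data)
t_colloc = torch.rand(100, 1) * 2.0

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = physics_informed_loss(t_data, u_data, t_colloc)
    loss.backward()
    opt.step()

The weight lam controls the trade-off the excerpt alludes to: a larger physics term restrains overfitting of noisy data, while a purely data-driven fit tends to generalize poorly.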
“…For time series prediction, RC assumes the role of regression, taking as input a segment of the time series up to a certain time and producing predictions for the next (few) time steps. Examples are abundant, including prediction of chaotic dynamics such as the Mackey-Glass equations [11, 31, 34, 51, 95], the Lorenz system [22, 26, 49, 51, 95-97], the Santa Fe chaotic time series [16, 86, 89, 95], the Ikeda system [95], the nonlinear auto-regressive moving average (NARMA) sequence [16, 28, 29, 93, 94, 98], the Hénon map [16, 35, 95, 98], radar signals [68], language sentences [36], stock data [61], sea surface temperatures (SST) [99], traffic breakdown [100-102], tool wear detection [97], and wind power [103]. Given a training time series and a prescribed prediction horizon τ, the input sequence of RC can be defined as u(t) = z(t) and the target output as y(t) = z(t + τ).…”
Section: Application Benchmarks of RC
mentioning
confidence: 99%
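To make the u(t) = z(t), y(t) = z(t + τ) setup concrete, below is a minimal echo-state-network sketch in NumPy. The toy signal (a noisy sine rather than the cited benchmarks), reservoir size, spectral radius, leak rate, washout length, and ridge regularization are all illustrative assumptions.

# Minimal echo state network sketch for tau-step-ahead prediction.
# Toy signal and all hyperparameters are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar series z(t); input u(t) = z(t), target y(t) = z(t + tau).
T, tau = 2000, 5
z = np.sin(0.05 * np.arange(T + tau)) + 0.01 * rng.standard_normal(T + tau)
u, y = z[:T], z[tau:T + tau]

# Fixed random reservoir; only the linear readout W_out is trained.
N, rho, leak = 300, 0.9, 1.0
w_in = rng.uniform(-0.5, 0.5, N)
W = rng.standard_normal((N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius to rho

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = (1 - leak) * x + leak * np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Ridge-regression readout, discarding an initial washout period.
washout, reg = 100, 1e-6
S, Y = states[washout:], y[washout:]
W_out = np.linalg.solve(S.T @ S + reg * np.eye(N), S.T @ Y)

pred = states @ W_out
print("train NRMSE:", np.sqrt(np.mean((pred[washout:] - Y) ** 2)) / np.std(Y))

Only the readout is fit (here by ridge regression), which is what lets RC handle the regression role described in the excerpt with very little training cost.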
“…Complex diseases are often the result of alterations in homeostasis induced by environmental or genetic factors. Extensive experimental and clinical evidence indicates that complex disease evolution is not always marked by a gradual pattern but rather distinguished by abrupt and qualitative alterations in the states of the system when reaching a critical transition or tipping point [1, 2]. Accordingly, disregarding the particular discrepancies in clinical manifestations and biological mechanisms across diverse ailments, disease evolution can be broken down into three distinct states: a stable relatively normal state, a pre-deterioration state characterized by diminished resilience and heightened susceptibility, and another stable deteriorated state (Fig. …”
Section: Introduction
mentioning
confidence: 99%