Extracting relevant properties of empirical signals generated by nonlinear, stochastic, and high-dimensional systems is a central challenge of complex systems research. Open questions include how to differentiate chaotic signals from stochastic ones, and how to quantify nonlinear and/or high-order temporal correlations. Here we propose a new technique that reliably addresses both problems. Our approach follows two steps: first, we train an artificial neural network (ANN) with flicker (colored) noise to predict the value of the parameter, α, that determines the strength of the correlation of the noise. To predict α, the ANN input features are a set of probabilities that are extracted from the time series by using symbolic ordinal analysis. Then, we input to the trained ANN the probabilities extracted from the time series of interest and analyze the ANN output. We find that the α value returned by the ANN is informative of the temporal correlations present in the time series. To distinguish between stochastic and chaotic signals, we exploit the fact that the difference between the permutation entropy (PE) of a given time series and the PE of flicker noise with the same α parameter is small when the time series is stochastic, but large when the time series is chaotic. We validate our technique by analyzing synthetic and empirical time series whose nature is well established, and we demonstrate the robustness of our approach with respect to the length of the time series and to the level of noise. We expect that our algorithm, which is freely available, will be very useful to the community.
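The symbolic ordinal analysis mentioned above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation (the function names are ours), of how ordinal-pattern probabilities and the normalized permutation entropy are computed from a time series, following the standard Bandt–Pompe procedure:

```python
from collections import Counter
from itertools import permutations
from math import log

def ordinal_probabilities(x, d=3):
    """Probabilities of the d! ordinal patterns of order d (Bandt-Pompe)."""
    patterns = [tuple(sorted(range(d), key=lambda i: window[i]))
                for window in (x[t:t + d] for t in range(len(x) - d + 1))]
    counts = Counter(patterns)
    n = len(patterns)
    # Keep zero-probability patterns so the feature vector has fixed length d!
    return [counts.get(p, 0) / n for p in permutations(range(d))]

def permutation_entropy(x, d=3):
    """Shannon entropy of the ordinal distribution, normalized to [0, 1]."""
    probs = ordinal_probabilities(x, d)
    h = -sum(p * log(p) for p in probs if p > 0)
    return h / log(len(probs))  # divide by log(d!) so the maximum is 1
```

A monotonic series produces a single ordinal pattern and hence zero entropy, while white noise approaches the maximum value of 1; a fixed-length probability vector like the one returned here is what would serve as the ANN input features.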
In this work, we study the phase synchronization of a neural network and explore how heterogeneity in the neurons' dynamics can lead their phases to intermittently phase-lock and unlock. The neurons are connected through chemical excitatory connections in a sparse random topology, receive no noise or external inputs, and have identical parameters except for different in-degrees. They follow a modification of the Hodgkin–Huxley model that adds details such as temperature dependence, and can burst either periodically or chaotically when uncoupled. Coupling makes them chaotic in all cases, but each uncoupled bursting mode leads to a different transition to phase synchronization in the networks as synaptic strength increases. In almost all cases, the neurons' inter-burst intervals differ among themselves, which indicates their dynamical heterogeneity and leads to their intermittent phase-locking. We argue that this behavior occurs here because of the neurons' chaotic dynamics and their differing initial conditions. We also investigate how this intermittency affects the formation of clusters of neurons in the network and show that the clusters' compositions change at a rate that follows the degree of intermittency. Finally, we discuss how these results relate to studies in the neuroscience literature, especially regarding metastability.
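Since the intermittent locking described above is defined through burst timings, one concrete way to quantify it (a sketch under our own conventions, not the paper's code; all names are illustrative) is to assign each neuron a piecewise-linear phase that grows by 2π per inter-burst interval and to compare inter-burst intervals between neurons:

```python
from bisect import bisect_right
from math import pi

def burst_phase(t, burst_times):
    """Piecewise-linear phase at time t: gains 2*pi per inter-burst interval.

    burst_times must be sorted, and t must lie between the first and last burst.
    """
    k = bisect_right(burst_times, t) - 1          # index of the last burst before t
    t0, t1 = burst_times[k], burst_times[k + 1]
    return 2 * pi * (k + (t - t0) / (t1 - t0))    # linear interpolation within the interval

def interburst_intervals(burst_times):
    """Successive differences of the burst onset times."""
    return [b - a for a, b in zip(burst_times, burst_times[1:])]
```

Two neurons are phase-locked over an interval when the difference of their burst phases stays bounded; intermittency shows up as alternating plateaus and drifts of that phase difference.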
Time series analysis comprises a wide repertoire of methods for extracting information from data sets. Despite great advances in time series analysis, identifying and quantifying the strength of nonlinear temporal correlations remain a challenge. We have recently proposed a new method based on training a machine learning algorithm to predict the temporal correlation parameter, α, of flicker noise (FN) time series. The algorithm is trained using as input features the probabilities of ordinal patterns computed from FN time series, x_α^FN(t), generated with different values of α. Then, the ordinal probabilities computed from the time series of interest, x(t), are used as input features to the trained algorithm, which returns a value, αe, that contains meaningful information about the temporal correlations present in x(t). We have also shown that the difference, Ω, between the permutation entropy (PE) of the time series of interest, x(t), and the PE of an FN time series generated with α=αe, x_αe^FN(t), allows the identification of the underlying determinism in x(t). Here, we apply our methodology to different datasets and analyze how αe and Ω correlate with well-known quantifiers of chaos and complexity. We also discuss the limitations for identifying determinism in highly chaotic time series and in periodic time series contaminated by noise. The open-source algorithm is available on GitHub.
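The reference signals in the Ω comparison are flicker-noise series with a prescribed correlation parameter α. A common way to synthesize such series, shown below as a sketch (the published code on GitHub may use a different generator), is to shape the Fourier spectrum of Gaussian white noise so that its power falls off as 1/f^α:

```python
import numpy as np

def flicker_noise(n, alpha, rng=None):
    """Generate a length-n 1/f**alpha noise series by Fourier filtering."""
    if rng is None:
        rng = np.random.default_rng(0)
    freqs = np.fft.rfftfreq(n)
    # Complex Gaussian spectrum of white noise
    spectrum = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
    # Amplitude scaling f**(-alpha/2) gives power spectrum ~ 1/f**alpha
    scale = np.ones_like(freqs)
    scale[1:] = freqs[1:] ** (-alpha / 2)
    scale[0] = 0.0  # drop the DC component
    x = np.fft.irfft(spectrum * scale, n)
    return (x - x.mean()) / x.std()  # zero mean, unit variance
```

With a generator like this, Ω can be estimated by subtracting the PE of `flicker_noise(len(x), alpha_e)` from the PE of x(t), averaging over realizations to reduce sampling fluctuations.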