We study the stability of asymptotic states displayed by a complex neural network. We focus on the loss of stability of a stationary state of networks, using recurrence quantifiers as tools to diagnose the local and global stability, as well as the multistability, of a coupled neural network. Numerical simulations of a neural network composed of 1024 neurons in a small-world connection scheme are performed using the model of Braun et al. [Int. J. Bifurcation Chaos 08, 881 (1998)], a modification of the Hodgkin-Huxley model [J. Physiol. 117, 500 (1952)]. To validate the analyses, the results are compared with those produced by Kuramoto's order parameter [Chemical Oscillations, Waves, and Turbulence (Springer-Verlag, Berlin Heidelberg, 1984)]. We show that recurrence tools using just the integrated signals provided by the networks, such as local field potential (LFP) signals or mean-field values, bring new insight into the neural behavior occurring before synchronization states. In particular, we show the occurrence of different stationary and nonstationary asymptotic states.
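As a point of reference for the validation step, the Kuramoto order parameter r measures how concentrated a set of oscillator phases is on the unit circle. A minimal sketch (the phase-extraction step from the simulated neuron signals is omitted; the phase array here is an assumed input):

```python
import numpy as np

def kuramoto_order_parameter(phases):
    """Magnitude r in [0, 1] of the mean phasor; r = 1 means full phase synchronization."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))

# identical phases -> full synchronization, r = 1
r_sync = kuramoto_order_parameter(np.zeros(100))
# phases spread uniformly over the circle -> r close to 0
r_async = kuramoto_order_parameter(np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False))
```

In contrast to this phase-based measure, the recurrence quantifiers discussed above need only an integrated scalar signal such as the LFP or mean field.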
Anomalous phase synchronization describes a synchronization phenomenon that occurs even in weakly coupled networks and is characterized by a non-monotonic dependence of the synchronization strength on the coupling strength. Its existence may provide a theoretical framework for some neurological diseases, such as Parkinson's disease and some episodes of seizure behavior generated by epilepsy. Despite the success of controlling or suppressing anomalous phase synchronization in neural networks by applying external perturbations or inducing ambient changes, the origin of anomalous phase synchronization, as well as the mechanisms behind its suppression, is not completely understood. Here, we consider networks composed of N=2000 coupled neurons in a small-world topology for two well-known neuron models, namely, the Hodgkin-Huxley-like and the Hindmarsh-Rose models, both displaying the anomalous phase synchronization regime. We show that anomalous phase synchronization may be related to the individual behavior of the coupled neurons; in particular, we identify a strong correlation between the behavior of the inter-burst intervals of the neurons, which we call neuron variability, and the ability of the network to display anomalous phase synchronization. We corroborate this idea by showing that external perturbations or ambient parameter changes that eliminate anomalous phase synchronization also promote small changes in the individual dynamics of the neurons, such that an increase in the individual variability of the neurons implies a decrease of anomalous phase synchronization. Finally, we demonstrate that this effect can be quantified using a well-known recurrence quantifier, the "determinism." Moreover, the determinism results require only the mean-field potential of the network, making these measures more suitable for experimental situations.
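The determinism quantifier mentioned above is the fraction of recurrent points that lie on diagonal lines of at least a minimum length in the recurrence plot. A minimal sketch for a scalar signal such as a mean-field potential (the threshold eps and the minimum line length lmin are illustrative choices, not the values used in the study):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a scalar series: R[i, j] = 1 if |x[i] - x[j]| <= eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = R.shape[0]
    total = on_lines = 0
    for k in range(-(n - 1), n):
        diag = np.diagonal(R, offset=k)
        total += int(diag.sum())
        run = 0
        for v in list(diag) + [0]:  # trailing 0 acts as a sentinel flushing the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

# a periodic signal yields long diagonal lines; noise yields mostly isolated points
x_per = np.sin(np.linspace(0.0, 8.0 * np.pi, 200))
x_rnd = np.random.default_rng(1).uniform(0.0, 1.0, 200)
det_per = determinism(recurrence_matrix(x_per, 0.1))
det_rnd = determinism(recurrence_matrix(x_rnd, 0.1))
```

The appeal noted in the abstract is that this computation consumes a single scalar channel, so no per-neuron recordings are required.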
The recurrence analysis of dynamical systems has been studied since Poincaré's seminal work. Since then, several approaches have been developed to study recurrence properties in nonlinear dynamical systems. In this work, we study the recently developed entropy of recurrence microstates. We propose a new quantifier, the maximum entropy (Smax). The new concept uses the diversity of microstates of the recurrence plot and is able to set the optimum recurrence neighborhood (ϵ-vicinity) automatically, making the analysis free of the vicinity parameter. In addition, ϵ turns out to be a novel quantifier of dynamical properties itself. We apply Smax and the optimum ϵ to deterministic and stochastic systems. The Smax quantifier correlates more strongly with the Lyapunov exponent and, since it is a parameter-free measure, is a more useful recurrence quantifier for time series.
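The idea behind Smax can be sketched as follows: sample small n-by-n blocks (microstates) of the recurrence plot, compute the Shannon entropy of their empirical distribution, and scan ϵ for the entropy maximum. This is only an illustrative reading of the procedure; the block size, sample count, and ϵ grid below are assumptions, not the paper's settings:

```python
import numpy as np

def microstate_entropy(x, eps, n=2, samples=2000, seed=0):
    """Shannon entropy (nats) of randomly sampled n-by-n recurrence microstates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    counts = {}
    for _ in range(samples):
        i = int(rng.integers(0, len(x) - n + 1))
        j = int(rng.integers(0, len(x) - n + 1))
        # n-by-n block of the recurrence matrix, computed on the fly
        block = (np.abs(x[i:i + n, None] - x[None, j:j + n]) <= eps).astype(np.uint8)
        key = block.tobytes()
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / samples
    return float(-np.sum(p * np.log(p)))

def max_entropy(x, eps_grid, **kw):
    """Scan the eps grid and return (Smax, optimum eps) at the entropy maximum."""
    vals = [microstate_entropy(x, e, **kw) for e in eps_grid]
    k = int(np.argmax(vals))
    return vals[k], eps_grid[k]

x = np.random.default_rng(2).uniform(0.0, 1.0, 400)
smax, eps_opt = max_entropy(x, [0.01, 0.05, 0.1, 0.2, 0.4, 1.5])
```

At the extremes the entropy collapses (ϵ too small gives all-zero blocks, ϵ too large all-one blocks), which is why an interior maximum exists and ϵ can be set automatically.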
Extracting relevant properties of empirical signals generated by nonlinear, stochastic, and high-dimensional systems is a challenge of complex systems research. Open questions are how to differentiate chaotic signals from stochastic ones, and how to quantify nonlinear and/or high-order temporal correlations. Here we propose a new technique to reliably address both problems. Our approach follows two steps: first, we train an artificial neural network (ANN) with flicker (colored) noise to predict the value of the parameter α that determines the strength of the correlation of the noise. To predict α, the ANN input features are a set of probabilities extracted from the time series by symbolic ordinal analysis. Then, we input to the trained ANN the probabilities extracted from the time series of interest and analyze the ANN output. We find that the α value returned by the ANN is informative of the temporal correlations present in the time series. To distinguish between stochastic and chaotic signals, we exploit the fact that the difference between the permutation entropy (PE) of a given time series and the PE of flicker noise with the same α parameter is small when the time series is stochastic, but large when the time series is chaotic. We validate our technique by analyzing synthetic and empirical time series whose nature is well established. We also demonstrate the robustness of our approach with respect to the length of the time series and to the level of noise. We expect that our algorithm, which is freely available, will be very useful to the community.
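The ordinal-analysis features and the permutation entropy used by the technique can be sketched as follows (the embedding dimension d = 3 is an illustrative choice, and the ANN itself is omitted):

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def ordinal_probs(x, d=3):
    """Probabilities of the d! ordinal patterns; these serve as the ANN input features."""
    x = np.asarray(x, dtype=float)
    index = {p: i for i, p in enumerate(permutations(range(d)))}
    counts = np.zeros(factorial(d))
    for k in range(len(x) - d + 1):
        counts[index[tuple(np.argsort(x[k:k + d]))]] += 1
    return counts / counts.sum()

def permutation_entropy(x, d=3):
    """Normalized PE in [0, 1]: close to 1 for white noise, 0 for a monotonic series."""
    p = ordinal_probs(x, d)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / log(factorial(d)))

pe_noise = permutation_entropy(np.random.default_rng(3).standard_normal(5000))
pe_mono = permutation_entropy(np.arange(100.0))
```

In the scheme described above, `ordinal_probs` would feed the trained ANN to estimate α, and `permutation_entropy` would then be compared against the PE of flicker noise with that same α to separate stochastic from chaotic signals.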