Abstract: In this contribution, a comparison between different permutation entropies as classifiers of electroencephalogram (EEG) records corresponding to normal and pre-ictal states is made. A discrete probability distribution function derived from symbolization techniques applied to the EEG signal is used to calculate the Tsallis entropy, Shannon entropy, Rényi entropy, and min-entropy, and they are used separately as the only independent variable in a logistic regression model in order to evaluate each one's capacity as a classification variable in an inferential manner. The area under the Receiver Operating Characteristic (ROC) curve, along with the accuracy, sensitivity, and specificity, is used to compare the models. All the permutation entropies are excellent classifiers, with an accuracy greater than 94.5% in every case and a sensitivity greater than 97%. Accounting for the amplitude in the symbolization technique retains more information about the signal than its counterparts, and it could be a good candidate for the automatic classification of EEG signals.
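The four entropies named above can all be computed from the same discrete probability distribution. The following is a minimal sketch using the standard textbook definitions; the entropic indices q = 2 and α = 2 and the natural-log base are illustrative assumptions, not values taken from the paper.

```python
from math import log

def shannon(p):
    """Shannon entropy H = -sum p_i * log(p_i), in nats."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

def tsallis(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def renyi(p, alpha=2.0):
    """Renyi entropy R_a = log(sum p_i^a) / (1 - a), for a != 1."""
    return log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def min_entropy(p):
    """Min-entropy H_inf = -log(max p_i), the alpha -> infinity Renyi limit."""
    return -log(max(p))
```

For a uniform distribution all four coincide with (functions of) log of the alphabet size, and each could then serve, as in the abstract, as a single feature in a logistic regression.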
In 2002, Bandt and Pompe [Phys. Rev. Lett. 88, 174102 (2002)] introduced a successful symbolic encoding scheme based on the ordinal relation between the amplitudes of neighboring values of a given data sequence, from which the permutation entropy can be evaluated. Equalities in the analyzed sequence, for example, repeated equal values, deserve special attention and treatment, as was shown recently by Zunino and co-workers [Phys. Lett. A 381, 1883 (2017)]. A significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. In the present contribution, we review the different existing methodologies for treating time series with tied values, classifying them according to their different strategies. In addition, a novel data-driven imputation method is presented that proves to outperform the existing methodologies and avoids the false conclusions pointed out by Zunino and co-workers.
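The Bandt–Pompe scheme maps each window of d consecutive values to the permutation that sorts it, then takes the Shannon entropy of the resulting pattern distribution. A minimal sketch follows; it breaks ties by temporal order of appearance, which is one common convention and precisely the kind of imperfect tie treatment the abstract discusses, not the paper's proposed imputation method.

```python
from collections import Counter
from math import log, factorial

def ordinal_pattern_distribution(series, d=3):
    """Estimate the Bandt-Pompe ordinal-pattern distribution of order d.

    Each window of d consecutive values is mapped to the permutation
    that sorts it; ties are broken by order of appearance.
    """
    counts = Counter()
    n_windows = len(series) - d + 1
    for i in range(n_windows):
        window = series[i:i + d]
        # Stable argsort: equal values keep their temporal order.
        pattern = tuple(sorted(range(d), key=lambda k: (window[k], k)))
        counts[pattern] += 1
    return {p: c / n_windows for p, c in counts.items()}

def permutation_entropy(series, d=3):
    """Shannon entropy of the ordinal-pattern distribution,
    normalized by log(d!) so that 0 <= H <= 1."""
    dist = ordinal_pattern_distribution(series, d)
    h = -sum(p * log(p) for p in dist.values())
    return h / log(factorial(d))
```

A strictly monotonic series produces a single pattern and hence zero entropy, while a long i.i.d. series approaches the maximum value of 1.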
In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the Permutation Entropy, but there is still no known method to determine the accuracy of this measure. There has been little research on the statistical properties of this quantity as a characterization of time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but all of these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series in order to obtain the distribution of the Permutation Entropy estimator. We perform several time series simulations given by well-known stochastic processes, the 1/f^α noise family, and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the Permutation Entropy, has been extensively used in epilepsy research for the detection of dynamical changes in electroencephalogram (EEG) signals, with no consideration of the variability of this complexity measure. An application of the parametric bootstrap methodology is used to compare normal and pre-ictal EEG signals.
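The idea of a parametric bootstrap on the symbolic representation can be sketched as follows: fit a multinomial model to the estimated pattern probabilities, draw synthetic symbol sequences from it, and recompute the entropy of each draw. This is a simplified illustration assuming independently drawn patterns, not the authors' full methodology.

```python
import random
from math import log

def bootstrap_pe_distribution(pattern_probs, n_symbols, n_boot=1000, seed=0):
    """Parametric bootstrap of the permutation-entropy estimator.

    pattern_probs: dict mapping each ordinal pattern to its estimated
    probability. Draws n_boot synthetic symbol sequences of length
    n_symbols from the fitted multinomial model and recomputes the
    normalized entropy of each, yielding the estimator's sampling
    distribution. Assumes patterns are drawn independently.
    """
    rng = random.Random(seed)
    patterns = list(pattern_probs)
    weights = [pattern_probs[p] for p in patterns]
    k = len(patterns)
    replicates = []
    for _ in range(n_boot):
        draws = rng.choices(range(k), weights=weights, k=n_symbols)
        counts = [0] * k
        for j in draws:
            counts[j] += 1
        freqs = [c / n_symbols for c in counts if c > 0]
        h = -sum(f * log(f) for f in freqs)
        # Normalize by log of the number of modeled patterns.
        replicates.append(h / log(k) if k > 1 else 0.0)
    return replicates
```

Quantiles of the returned replicates then give confidence intervals for the Permutation Entropy, which is what comparing normal and pre-ictal EEG signals with "consideration of the variability" would require.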