2017
DOI: 10.1063/1.4999613

Detecting dynamical changes in time series by using the Jensen Shannon divergence

Abstract: Most time series found in nature are a mixture of signals with deterministic and random dynamics, so distinguishing between these two components becomes important. Telling chaotic signals apart from purely random ones is difficult because both exhibit a wide-band power spectrum, a delta-like autocorrelation function, and other shared features. In general, signals come as continuous records and must be discretized before they can be analyzed. In this work we present different schemes for …

Cited by 19 publications (15 citation statements); references 42 publications (33 reference statements).
“…With the aim of achieving a better characterization regarding the significance of the results obtained with the PJSD, we have developed a comparative analysis with two other quantifiers of similarity/dissimilarity between signals, namely the information-based similarity index (IBSI) [32,33] and the alphabetic Jensen-Shannon divergence (aJSD) [38]. In both of them, a binary coarse graining depending on the relative amplitudes of successive values is first implemented and, then, binary sequences of length m (m-bit words) are obtained.…”
Section: Appendix: Performance Comparison Against Other Approaches
mentioning confidence: 99%
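The coarse-graining step quoted above can be illustrated with a minimal Python sketch. The function names, the increase/decrease encoding rule, and the choice m = 4 below are our own illustrative assumptions rather than the exact recipe of the cited works: each sample is mapped to 1 if it exceeds the previous one and to 0 otherwise, and the relative frequencies of the overlapping m-bit words form the distribution that quantifiers such as IBSI and aJSD then compare.

```python
import numpy as np
from collections import Counter

def binary_coarse_grain(x):
    """Map a real-valued series to bits: 1 if a value increases with
    respect to the previous one, 0 otherwise (illustrative rule)."""
    x = np.asarray(x, dtype=float)
    return (np.diff(x) > 0).astype(int)

def word_distribution(bits, m=4):
    """Relative frequencies of the 2**m possible overlapping m-bit words."""
    words = ["".join(map(str, bits[i:i + m])) for i in range(len(bits) - m + 1)]
    counts = Counter(words)
    total = len(words)
    # Probability for every possible m-bit word (zero if never observed).
    return {format(k, f"0{m}b"): counts.get(format(k, f"0{m}b"), 0) / total
            for k in range(2 ** m)}

# Example: word distribution of a noisy sine wave.
t = np.linspace(0, 20 * np.pi, 5000)
signal = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(word_distribution(binary_coarse_grain(signal), m=4))
```

Two signals can then be compared by confronting their word distributions, which is where the Jensen-Shannon divergence enters.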
“…Much more recently, an approach based on the differences between Rényi entropy spectra has been implemented by Xu and Beck [14]. Finally, the alphabetic Jensen-Shannon divergence, a relative distance (in a distributional sense) between time series, has been introduced by Mateos et al. [38] for detecting dynamical changes. A binary encoding is undertaken as a coarse-grained representation of the signals, and then used to estimate the Jensen-Shannon divergence.…”
Section: Introduction
mentioning confidence: 99%
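The divergence itself is standard information theory; below is a minimal, self-contained sketch (our own illustration, not the authors' reference implementation) that computes the Jensen-Shannon divergence between two word-frequency distributions as the mean Kullback-Leibler divergence of each distribution to their average.

```python
import numpy as np

def jensen_shannon_divergence(p, q):
    """Jensen-Shannon divergence (in bits) between two discrete
    distributions given as aligned probability vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        mask = a > 0                      # convention: 0 * log(0) = 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy word-frequency vectors over the four possible 2-bit words 00, 01, 10, 11.
p = [0.10, 0.40, 0.40, 0.10]   # e.g. a strongly correlated signal
q = [0.25, 0.25, 0.25, 0.25]   # e.g. an uncorrelated random signal
print(jensen_shannon_divergence(p, q))  # 0 for identical distributions
```

With base-2 logarithms the divergence is bounded between 0 and 1, which makes it convenient as a normalized dissimilarity between coarse-grained signals.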
“…We also normalize the number of observations in each bin by the total number of single pulses in each observation. For each of these fits we perform a Jensen-Shannon divergence test (Mateos et al. 2017) to quantify the difference between the empirical and the theoretical distribution. Table 2 gives the scintillation fit parameters of the amplitude versus the number of pulses (the insets in Figs.…”
Section: Scintillations
mentioning confidence: 99%
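As a rough sketch of how an empirical histogram might be compared with a fitted theoretical distribution through the Jensen-Shannon divergence, the snippet below uses SciPy; the exponential fit and the simulated amplitudes are purely illustrative stand-ins for the data in the citing work, and note that scipy.spatial.distance.jensenshannon returns the square root of the divergence (the Jensen-Shannon distance).

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import jensenshannon

# Simulated single-pulse amplitudes (illustrative stand-in data).
rng = np.random.default_rng(0)
amplitudes = rng.exponential(scale=2.0, size=2000)

# Empirical distribution: normalized counts per amplitude bin.
counts, edges = np.histogram(amplitudes, bins=30)
empirical = counts / counts.sum()

# Theoretical distribution: probability mass of a fitted exponential
# law integrated over the same bins.
loc, scale = stats.expon.fit(amplitudes)
theoretical = np.diff(stats.expon.cdf(edges, loc=loc, scale=scale))
theoretical /= theoretical.sum()

# SciPy returns the Jensen-Shannon *distance* (square root of the divergence).
print(jensenshannon(empirical, theoretical, base=2))
```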
“…The difference between sequences ι(s_i, s_j) is calculated using the so-called “alphabetic Jensen-Shannon Distance” (aJSD; Mateos et al. 2017), which first discretizes time-series data in a combined probability space, to which the Jensen-Shannon Distance can then be applied (see the algorithm description in eq. 6).…”
Section: Conflict Of Interest Statement
mentioning confidence: 99%
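Finally, a compact sketch of how such pairwise differences ι(s_i, s_j) could be assembled into a dissimilarity matrix; the coarse-graining rule, the word length m = 3, and the use of SciPy's Jensen-Shannon distance are our own simplifications and only stand in for the aJSD algorithm referenced above.

```python
import numpy as np
from collections import Counter
from scipy.spatial.distance import jensenshannon

def word_probabilities(x, m=3):
    """Binary coarse graining (1 = increase, 0 = otherwise) followed by the
    relative frequencies of the 2**m possible overlapping m-bit words."""
    bits = (np.diff(np.asarray(x, dtype=float)) > 0).astype(int)
    words = [tuple(bits[i:i + m].tolist()) for i in range(len(bits) - m + 1)]
    counts = Counter(words)
    alphabet = [tuple(int(b) for b in format(k, f"0{m}b")) for k in range(2 ** m)]
    return np.array([counts.get(w, 0) / len(words) for w in alphabet])

def dissimilarity_matrix(series, m=3):
    """Pairwise Jensen-Shannon distances between coarse-grained series."""
    dists = [word_probabilities(s, m) for s in series]
    n = len(dists)
    iota = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            iota[i, j] = iota[j, i] = jensenshannon(dists[i], dists[j], base=2)
    return iota

# Three toy records: two sinusoids and one white-noise series.
t = np.linspace(0, 50, 3000)
rng = np.random.default_rng(2)
print(dissimilarity_matrix([np.sin(t), np.sin(1.1 * t), rng.normal(size=t.size)]))
```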