2002
DOI: 10.1103/PhysRevE.65.041905
Analysis of symbolic sequences using the Jensen-Shannon divergence

Abstract: We study statistical properties of the Jensen-Shannon divergence D, which quantifies the difference between probability distributions, and which has been widely applied to analyses of symbolic sequences. We present three interpretations of D in the framework of statistical physics, information theory, and mathematical statistics, and obtain approximations of the mean, the variance, and the probability distribution of D in random, uncorrelated sequences. We present a segmentation method based on D that is able …
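As context for the abstract, here is a minimal sketch of the quantity D being studied, assuming discrete distributions P and Q and the standard weighted definition D = H(pi*P + (1-pi)*Q) - pi*H(P) - (1-pi)*H(Q), where H is the Shannon entropy (pi = 1/2 gives the usual symmetric JSD). The function names and the NumPy implementation are illustrative, not the authors' code.

import numpy as np

def shannon_entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution (0*log(0) treated as 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(base)

def jsd(p, q, pi=0.5, base=2.0):
    """Weighted Jensen-Shannon divergence D between two discrete distributions."""
    p = np.asarray(p, dtype=float); p /= p.sum()
    q = np.asarray(q, dtype=float); q /= q.sum()
    m = pi * p + (1.0 - pi) * q  # the mixture distribution
    return (shannon_entropy(m, base)
            - pi * shannon_entropy(p, base)
            - (1.0 - pi) * shannon_entropy(q, base))

# Equal-weight example on two hypothetical symbol distributions:
print(jsd([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]))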

Cited by 257 publications (257 citation statements)
References 37 publications
“…In addition to previously used measures such as the maximum spiking rate (24) and total spike count (9, 19) within the time course of the response (Fig. 2 A and B), we also explored the "total information gain" as measured by D_JS, which has been used in time series analysis of discrete events (25) and appeared to be suitable for quantifying differences between spike trains (Fig. 2).…”
Section: Results (mentioning)
Confidence: 99%
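A hedged illustration of the idea in this excerpt: bin two spike trains into time histograms and compare them with D_JS. SciPy's jensenshannon returns the JS distance, i.e. the square root of the divergence, so it is squared here. The spike times and bin layout are invented for the example; this is not the cited study's analysis pipeline.

import numpy as np
from scipy.spatial.distance import jensenshannon

# Hypothetical spike times (seconds) for two responses in a 0-0.3 s window.
spikes_a = np.array([0.01, 0.03, 0.05, 0.12, 0.20])
spikes_b = np.array([0.02, 0.08, 0.15, 0.18, 0.22, 0.25])

bins = np.linspace(0.0, 0.3, 16)
p, _ = np.histogram(spikes_a, bins=bins)
q, _ = np.histogram(spikes_b, bins=bins)

# A small pseudocount keeps empty bins from yielding degenerate distributions.
p = (p + 1e-9) / (p + 1e-9).sum()
q = (q + 1e-9) / (q + 1e-9).sum()

# jensenshannon returns the JS *distance*; square it to get the divergence D_JS.
d_js = jensenshannon(p, q, base=2) ** 2
print(d_js)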
“…Grosse et al. [35] pointed out that the Jensen-Shannon divergence (JSD) is extremely useful when it comes to discriminating between two (or more) sources. Capra and Singh [34] carefully discussed several information-theoretic measures, such as Shannon entropy, von Neumann entropy, relative entropy, and sum-of-pairs measures, to assess sequence conservation.…”
Section: Methods (mentioning)
Confidence: 99%
“…For this aim, we introduce and evaluate a new information-theory-based method for predicting these residues using the Jensen-Shannon divergence (JSD). As a divergence measure based on the Shannon entropy, the JSD is a symmetrized and smoothed version of the Kullback-Leibler divergence and is often used for a range of problems in bioinformatics [31][32][33][34][35]. In this study, following Capra et al. [34], we first quantify the divergence between the observed amino acid distribution of a site in a protein and the background distribution of non-binding sites by using the JSD.…”
Section: Introduction (mentioning)
Confidence: 99%
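The "symmetrized and smoothed Kullback-Leibler divergence" reading mentioned in this excerpt can be written as D_JS(P, Q) = (1/2) KL(P || M) + (1/2) KL(Q || M) with M = (P + Q)/2. A minimal sketch under that definition, applied to a made-up alignment-column distribution versus a uniform background (the numbers are illustrative and not taken from [34]):

import numpy as np
from scipy.stats import entropy  # entropy(p, q) is the Kullback-Leibler divergence

def jsd(p, q, base=2):
    """JSD as the symmetrized, smoothed KL divergence against the mixture M."""
    p = np.asarray(p, dtype=float); p /= p.sum()
    q = np.asarray(q, dtype=float); q /= q.sum()
    m = 0.5 * (p + q)
    return 0.5 * entropy(p, m, base=base) + 0.5 * entropy(q, m, base=base)

# Hypothetical observed residue frequencies at one site vs. a uniform background
# over four residue classes; a large value flags divergence from the background.
site = [0.70, 0.10, 0.10, 0.10]
background = [0.25, 0.25, 0.25, 0.25]
print(jsd(site, background))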
“…2, the Jensen-Shannon distance between the distributions from sequential UniGen2 and IS is 0.049, while the corresponding figure for parallel UniGen2 and IS is 0.052. These small Jensen-Shannon distances make the distribution of UniGen2 (whether sequential or parallel) indistinguishable from that of IS (see Section IV(C) of [13]).…”
Section: Uniformity Comparison (mentioning)
Confidence: 99%
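To illustrate why a Jensen-Shannon distance near 0.05 suggests indistinguishable samplers: two independent empirical distributions drawn from the same underlying probabilities also land at a comparably small distance. The probabilities and sample sizes below are invented for the sketch and are unrelated to the UniGen2 or IS data.

import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(0)

# Invented ground-truth distribution over 8 outcomes.
truth = np.array([0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05])

# Two independent empirical estimates from 10,000 samples each.
emp_a = rng.multinomial(10_000, truth) / 10_000
emp_b = rng.multinomial(10_000, truth) / 10_000

# A small JS distance (sqrt of the divergence) means the two samples are
# consistent with having come from the same distribution.
print(jensenshannon(emp_a, emp_b, base=2))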