2011
DOI: 10.1063/1.3637490
Information symmetries in irreversible processes

Abstract: We study dynamical reversibility in stationary stochastic processes from an information theoretic perspective. Extending earlier work on the reversibility of Markov chains, we focus on finitary processes with arbitrarily long conditional correlations. In particular, we examine stationary processes represented or generated by edge-emitting, finite-state hidden Markov models. Surprisingly, we find pervasive temporal asymmetries in the statistics of such stationary processes with the consequence that the computat…

Cited by 21 publications (23 citation statements)
References 42 publications (93 reference statements)
“…Pr(σ⁺ = (τ, g, x) | σ⁻ = (g′, x′, τ′)) = Pr(τ | g, x, τ′) δ_{x,x′} Pr(g | g′, x′, x, τ′) = φ_{g,x}(τ + τ′) Pr(g | g′, x′) δ_{x,x′}, where we obtain p(g | g′, x′) from standard methods [12,26] applied to (only) the dynamic on G. Note that p(g | g′, x′, x, τ′) reduces to p(g | g′, x′), as g′ and x′ uniquely specify the distribution from which τ′ is drawn and since x = x′. We leave the final steps to E as an exercise.…”
Section: Unifilar Hidden Semi-Markov Models
mentioning
confidence: 99%
“…In general, the forward- and reverse-time statistical complexities are not equal [22,23]. That is, different amounts of information must be stored from the past (future) to predict (retrodict).…”
Section: A. Processes and Their Causal States
mentioning
confidence: 99%
“…A corollary is that the predictive information bottleneck (compression of semi-infinite pasts to retain information about semi-infinite futures) can be recast as compression of forward-time causal states to retain information about reverse-time causal states. The joint probability distribution of forward- and reverse-time causal states may seem somewhat elusive, but previous work has shown that this joint probability distribution can be obtained given the process' model [23].…”
Section: Recasting Predictive Rate Distortion Theory
mentioning
confidence: 99%
“…This HMM, parametrized by z, produces the golden mean process at all z ∈ [0.5, 1], but the hidden states share less and less information with the output past as z increases, as shown by Ref. [36]. The extreme at z = 0.5 corresponds to the minimal predictive generator, the ϵ-machine.…”
Section: Retrodictive Generators
mentioning
confidence: 96%
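The golden mean process quoted in this excerpt is simple to simulate. Below is a minimal sketch (not code from any cited paper) of its ϵ-machine as a two-state generator; the state names A/B are illustrative, and only the z = 0.5 extreme (the minimal predictive generator) is shown, not the full z-parametrized family of HMMs the excerpt describes:

```python
import random

# Illustrative sketch: the epsilon-machine of the golden mean process,
# a binary process that never emits two 0s in a row.
# State A: emit 1 (stay in A) or 0 (go to B), each with probability 1/2.
# State B: must emit 1 and return to A.

def golden_mean(n, seed=0):
    """Generate n symbols from the golden mean process epsilon-machine."""
    rng = random.Random(seed)
    state = "A"
    out = []
    for _ in range(n):
        if state == "A":
            if rng.random() < 0.5:
                out.append(1)          # stay in A
            else:
                out.append(0)          # forbidden to repeat: go to B
                state = "B"
        else:
            out.append(1)              # B always emits 1, returns to A
            state = "A"
    return out

seq = golden_mean(10_000)
print("00" in "".join(map(str, seq)))  # False: the word 00 never occurs
```

Because every 0 forces a transition to state B, which deterministically emits a 1, the forbidden word 00 cannot appear in any sample path.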
“…This is shown in Fig. 2 via an information diagram, a tool that lays out informational interdependencies between random variables [35] and has been particularly useful in analyzing temporal information processing [36,37]. Figure 2 also shows that the modularity dissipation, highlighted by a dashed red outline, can be reexpressed as the mutual information between the noninteracting stationary system Z_s and the interacting system Z_i before the computation that is not shared with Z_i after the computation:…”
Section: Global Versus Localized Processing
mentioning
confidence: 99%
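The mutual-information bookkeeping behind information diagrams like the one this excerpt cites can be made concrete with a small numerical sketch. The function and the joint distribution below are invented for illustration and do not come from any of the cited works:

```python
import numpy as np

# Illustrative sketch: mutual information I(X;Y) in bits from a joint
# distribution, the elementary quantity that information diagrams lay out.

def mutual_information(p_xy):
    """I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ]."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal over y
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal over x
    mask = p_xy > 0                          # skip zero-probability cells
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Two perfectly correlated bits share exactly 1 bit of information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```

For independent variables the joint factorizes, every log term vanishes, and the mutual information is zero, which is the empty-overlap region of an information diagram.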