2016
DOI: 10.1016/j.biosystems.2015.11.008

Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory

Abstract: Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact the system. The capability of replicated structures to access high-quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements of maintaining a system far from equilibrium. Using Landauer's…
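The abstract invokes Landauer's principle, which bounds the energy cost of erasing information: erasing one bit at temperature T dissipates at least k_B·T·ln 2 of energy. A minimal sketch of that bound (the function name and the 300 K example are illustrative, not from the paper):

```python
import math

BOLTZMANN_K = 1.380649e-23  # Boltzmann constant in J/K (exact 2019 SI value)

def landauer_bound_joules(temperature_kelvin: float, bits: float = 1.0) -> float:
    """Minimum energy dissipated when erasing `bits` of information at the
    given temperature, per Landauer's principle: E >= bits * k_B * T * ln 2."""
    return bits * BOLTZMANN_K * temperature_kelvin * math.log(2)

# At room temperature (300 K), erasing one bit costs at least ~2.87e-21 J.
energy = landauer_bound_joules(300.0)
```

The bound scales linearly in both temperature and the number of bits erased, which is what lets entropy bookkeeping in bits be converted into thermodynamic entropy requirements.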


Cited by 6 publications (21 citation statements) · References 32 publications
“…The algorithmic entropy aligns with the Shannon entropy and the entropy of thermodynamics [2,7,8]. Whenever pattern or structure is observed in a string defining a natural system, an algorithm can be constructed to provide a compressed description of the string, although unrecognised structure may allow further compression.…”
Section: Natural Ordering Processes
confidence: 87%
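The excerpt above notes that observed pattern in a string permits a compressed description, while unrecognised structure may allow further compression still. A small sketch of that idea, using zlib compression as a crude, computable upper bound on algorithmic entropy (the choice of zlib and the example strings are assumptions for illustration):

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Length in bytes of the zlib-compressed string: an upper bound on the
    length of a short description of s, i.e. a proxy for algorithmic entropy."""
    return len(zlib.compress(s.encode("utf-8"), level=9))

# A highly patterned string has a short generating program ("repeat 'ab' 500x").
patterned = "ab" * 500

# A pseudo-random string over the same alphabet has no pattern zlib can exploit
# beyond its two-symbol alphabet, so it compresses far less.
random.seed(0)
incompressible = "".join(random.choice("ab") for _ in range(1000))
```

Note the one-sided nature of the bound: a compressor that fails to shrink a string does not prove the string is incompressible, just as the excerpt says unrecognised structure may allow further compression.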
“…This measure is similar to the Gács entropy, Kolmogorov's Algorithmic Minimum Sufficient Statistic, and Zurek's intuitive physical entropy concept [2]. The entropy of Shannon's information theory, by contrast, is a characteristic of the macrostate itself and corresponds to H(Shannon) = log₂ Ω.…”
Section: The Provisional Entropy of a Set of States
confidence: 93%
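The excerpt's identity H(Shannon) = log₂ Ω is the special case of the general Shannon formula H = −Σ pᵢ log₂ pᵢ when all Ω microstates of the macrostate are equally probable. A minimal sketch checking that agreement (the function and the Ω = 8 example are illustrative):

```python
import math

def shannon_entropy(probabilities) -> float:
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
    Zero-probability outcomes contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

omega = 8                       # number of equally likely microstates
uniform = [1 / omega] * omega   # the uniform distribution over the macrostate

# For a uniform macrostate, H(Shannon) = log2(omega) = 3 bits.
h_uniform = shannon_entropy(uniform)
```

Any non-uniform distribution over the same Ω microstates has strictly lower entropy, which is why log₂ Ω characterises the macrostate itself rather than any particular microstate description.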