2020
DOI: 10.1038/s41598-020-61135-7
Generic predictions of output probability based on complexities of inputs and outputs

Abstract: For a broad class of input-output maps, arguments based on the coding theorem from algorithmic information theory (AIT) predict that simple (low Kolmogorov complexity) outputs are exponentially more likely to occur upon uniform random sampling of inputs than complex outputs are. Here, we derive probability bounds that are based on the complexities of the inputs as well as the outputs, rather than just on the complexities of the outputs. The more that outputs deviate from the coding theorem bound, the lower the…
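For reference, the coding-theorem-style bound discussed in the abstract is commonly written in the simplicity-bias literature in a form like the following (notation assumed here: $\tilde{K}(x)$ is an approximable stand-in for the Kolmogorov complexity of output $x$, and $a$, $b$ are map-dependent constants):

    P(x) \le 2^{-a\tilde{K}(x) - b}

so that, up to the constants, the probability of an output under uniform random sampling of inputs decays exponentially with the output's complexity. The paper's contribution is to refine such bounds using input complexities as well.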

Cited by 25 publications (66 citation statements)
References 37 publications
“…Similarly, a larger robustness may also enhance the ability of a system to encode cryptic variation, facilitating access to new phenotypes [44]. A natural tendency towards simpler and more robust structures may therefore facilitate the emergence of modularity, where individual components can evolve independently [45], and so make living systems more globally evolvable.…”
Section: Discussion
confidence: 99%
“…(17) only provides an upper bound. Nevertheless, a statistical lower bound can be derived [11,45] showing that most of the probability weight in P(x) will be close to the bound (17). Interestingly, outputs that are far from the bound can be shown to have inputs (in this case genomes or bonding patterns) that themselves are unusually simple [45].…”
Section: S5 Supplementary Text for Algorithmic Information Theory and Complexity Measures
confidence: 97%
“…Moreover, algorithmic probability estimates have successfully been made via a weaker form of Levin's coding theorem [13], applicable in real-world contexts. This weaker form was applied to a range of input-output maps to make a priori predictions regarding the probability of different shapes and patterns, such as the probability of different RNA shapes appearing on a random choice of genetic sequence, or the probability of differential equation solution profile shapes on a random choice of input parameters, among several other examples [13,14]. Surprisingly, it was found that probability estimates could be made directly from the complexities of the shapes themselves, without recourse to the details of the map or reference to how the shapes were generated.…”
Section: Introduction
confidence: 99%
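The idea in the excerpt above — that output probabilities under random sampling of inputs correlate with output complexity alone — can be illustrated with a minimal sketch. The toy map `f` (a thresholded sinusoid) and the zlib-compression complexity proxy below are illustrative assumptions, not the maps or complexity measures used in the cited papers:

```python
import math
import random
import zlib
from collections import Counter

def K_approx(s: str) -> int:
    """Crude Kolmogorov-complexity proxy: zlib-compressed length in bits.
    The string is repeated to dilute the fixed compressor overhead."""
    return 8 * len(zlib.compress((s * 50).encode()))

def f(theta: float, n: int = 25) -> str:
    """Toy input-output map: sign pattern of a sinusoid at n sample points."""
    return "".join("1" if math.sin(theta * i) > 0 else "0"
                   for i in range(1, n + 1))

random.seed(0)
samples = 100_000
# Uniform random sampling of inputs, tallying how often each output appears.
counts = Counter(f(random.uniform(0.0, 10.0)) for _ in range(samples))

# Simplicity bias predicts that high-probability outputs have low K_approx.
for out, c in counts.most_common(5):
    print(f"P≈{c / samples:.4f}  K≈{K_approx(out)} bits  {out}")
```

Running this lists the most frequently sampled outputs alongside their approximate complexities; under simplicity bias, the frequent outputs (e.g. the all-ones sign pattern, produced by a whole interval of small `theta` values) tend to compress far better than the rare, irregular ones.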