2020
DOI: 10.1101/2020.06.10.145300
Preprint
Computation noise promotes cognitive resilience to adverse conditions during decision-making

Abstract: Random noise in information processing systems is widely seen as detrimental to function. But despite the large trial-to-trial variability of neural activity and behavior, humans and other animals show a remarkable adaptability to unexpected adverse events occurring during task execution. This cognitive ability, described as constitutive of general intelligence, is missing from current artificial intelligence (AI) systems which feature exact (noise-free) computations. Here we show that implementing computation…

Cited by 11 publications (44 citation statements)
References 53 publications (60 reference statements)
“…Crucially, in our work we provide a direct link of the necessity of noise for systems that aim at optimizing decision behavior under our encoding and limited-capacity assumptions, which can be seen as algorithmic specifications of the more realistic population coding specifications mentioned above ( Nikitin et al, 2009 ). We argue that our results may provide a formal intuition for the apparent necessity of noise for improving training and learning performance in artificial neural networks ( Dapello et al, 2020 ; Findling and Wyart, 2020 ), and we speculate that an implementation of 'the right' noise distribution for a given environmental statistic could be seen as a potential mechanism to improve performance in capacity-limited agents generally speaking ( Garrett et al, 2011 ). We acknowledge that based on the results of our work, we cannot confirm whether this is the case for higher order neural circuits, however, we leave it as an interesting theoretical formulation, which could be addressed in future work.…”
Section: Discussion
confidence: 73%
“…Finally, there is substantial interindividual variability which does not exist in the optimal solution (Khaw et al., 2021; Nassar et al., 2010, 2012; Prat-Carrabin et al., 2021). In the future, these suboptimalities could be explored using our networks by making them suboptimal in three ways (among others): by stopping training before quasi-optimal performance is reached (Caucheteux & King, 2021; Orhan & Ma, 2017), by constraining the size of the network or its weights (with hard constraints or with regularization penalties) (Mastrogiuseppe & Ostojic, 2017; Sussillo et al., 2015), or by altering the network in a certain way, such as pruning some of the units or some of the connections (Blalock et al., 2020; Chechik et al., 1999; LeCun et al., 1990; Srivastava et al., 2014), or introducing random noise into the activity (Findling et al., 2021; Findling & Wyart, 2020; Legenstein & Maass, 2014). In this way, one could perhaps reproduce the quantitative deviations from optimality while preserving the qualitative aspects of optimality observed in the laboratory.…”
Section: Suboptimalities In Human Behavior
confidence: 99%
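The last manipulation mentioned in the excerpt above — introducing random noise into a network's activity — can be illustrated in a few lines. This is only a minimal NumPy sketch under assumed names and layer sizes, not the implementation used in any of the cited papers: zero-mean Gaussian noise is added to the hidden activations of a small two-layer network, so that repeated presentations of the same input yield variable outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2, noise_std=0.0):
    """Two-layer network; optionally corrupt the hidden activity
    with zero-mean Gaussian 'computation noise'."""
    h = np.tanh(x @ W1)
    if noise_std > 0:
        h = h + rng.normal(0.0, noise_std, size=h.shape)  # noise injected into activity
    return h @ W2

# Illustrative shapes: 4 stimuli, 3 input features, 8 hidden units, 2 outputs.
x = rng.normal(size=(4, 3))
W1 = rng.normal(size=(3, 8)) / np.sqrt(3)
W2 = rng.normal(size=(8, 2)) / np.sqrt(8)

y_exact = forward(x, W1, W2, noise_std=0.0)  # deterministic: identical on every call
y_noisy = forward(x, W1, W2, noise_std=0.1)  # varies from call to call
```

With `noise_std=0` the network performs exact (noise-free) computations; raising it makes outputs vary trial to trial for a fixed input, which is the kind of suboptimality the excerpt proposes studying.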
“…Systematic manipulations of evidence order such as the ones we introduced here may thus be helpful in clarifying the computational mechanisms underlying delusions. Further, our results also emphasize that alterations in noisy-sampling (and other limited-capacity) inference processes should be evaluated as candidate explanations for maladaptive or pathological beliefs, particularly given increasing support for their role in adaptive behaviors [97, 98].…”
Section: Discussion
confidence: 92%