2015
DOI: 10.3389/frobt.2015.00027
Bounded Rationality, Abstraction, and Hierarchical Decision-Making: An Information-Theoretic Optimality Principle

Abstract: Abstraction and hierarchical information processing are hallmarks of human and animal intelligence underlying the unrivaled flexibility of behavior in biological systems. Achieving such flexibility in artificial systems is challenging, even with more and more computational power. Here, we investigate the hypothesis that abstraction and hierarchical information processing might in fact be the consequence of limitations in information-processing power. In particular, we study an information-theoretic framework o…

Cited by 89 publications
(163 citation statements)
References 76 publications
“…This lack of knowledge of statistical properties of the task leads to non-optimal encoding and large information costs. This is because participants will have to start from uninformative prior distributions p_0(z), p_0(y) and p_0(y|z) that may be far from the true marginal distributions of the task, which they will have to learn across trial repetitions (Genewein et al., 2015). In this case the additional cost of starting with the wrong priors can be formalized as:…”
Section: The Costs of Novel or Unfamiliar Tasks
confidence: 99%
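The extra cost of starting from a mismatched prior is conventionally measured by a Kullback-Leibler divergence between the true task distribution and the prior actually used. The sketch below illustrates this with hypothetical numbers (the three-outcome distribution and the uniform starting prior are assumptions for illustration, not values from the paper):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes p and q are strictly positive and sum to 1."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log2(p / q)))

# Hypothetical true task statistics vs. an uninformative (uniform) starting prior.
p_true = np.array([0.7, 0.2, 0.1])
p_0 = np.full(3, 1.0 / 3.0)

extra_cost = kl_divergence(p_true, p_0)  # extra bits paid per decision
```

Starting from the uniform p_0 here costs roughly 0.43 extra bits per decision; the cost vanishes as the learned prior approaches the true marginal, matching the intuition in the quoted passage that the penalty disappears across trial repetitions.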
“…Beyond this level of compression, relevant information is necessarily discarded and distortion starts to increase, leading to lossy compression. This trade-off is usually implemented as a parameter that constrains the capacity (or maximal mutual information) of the system (Alemi et al., 2016; Denève et al., 2017; Genewein et al., 2015; Kingma and Welling, 2013; Ortega and Braun, 2013; Tishby et al., 2000): F = E[U] − λ I(X;Y), where U represents some utility measure and λ is a (Lagrangian) factor that adjusts the trade-off between costs and performance. In order to apply this framework to behavioural data, this parameter then has to be fit to observed performance data (Sims, 2016).…”
Section: The Costs of Controlling the Rate of Information Processing
confidence: 99%
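The utility-information trade-off described in this quoted passage can be solved numerically with a Blahut-Arimoto-style iteration, the standard approach in this family of models. The sketch below uses an inverse-temperature parameter beta in place of the quoted Lagrangian factor, and a toy 3x3 utility matrix chosen purely for illustration (none of these values come from the paper):

```python
import numpy as np

def bounded_rational_policy(U, p_x, beta, iters=200):
    """Iterate p(y|x) proportional to p(y) * exp(beta * U[x, y]),
    with the marginal p(y) = sum_x p(x) p(y|x) updated self-consistently.
    Larger beta buys more expected utility at a higher information cost."""
    n_y = U.shape[1]
    p_y = np.full(n_y, 1.0 / n_y)
    for _ in range(iters):
        cond = p_y * np.exp(beta * U)            # unnormalized p(y|x), shape (n_x, n_y)
        cond /= cond.sum(axis=1, keepdims=True)  # normalize each row over y
        p_y = p_x @ cond                         # self-consistent marginal p(y)
    return cond, p_y

def mutual_information(p_x, cond, p_y):
    """I(X;Y) in bits for the joint distribution p(x) p(y|x)."""
    ratio = np.where(cond > 0, cond / p_y, 1.0)
    return float(np.sum(p_x[:, None] * cond * np.log2(ratio)))

# Toy problem (illustrative assumption): matching y to x pays utility 1.
U = np.eye(3)
p_x = np.full(3, 1.0 / 3.0)
soft, p_y_soft = bounded_rational_policy(U, p_x, beta=1.0)   # cheap, imprecise policy
hard, p_y_hard = bounded_rational_policy(U, p_x, beta=50.0)  # near-deterministic policy
```

Sweeping beta traces out the rate-utility curve the passage refers to: the low-beta policy stays close to the prior and carries little mutual information, while the high-beta policy approaches the deterministic optimum at a capacity near log2(3) bits. Fitting this single parameter to observed performance is what the quoted text attributes to Sims (2016).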