2018
DOI: 10.1111/tops.12345

Seeing Patterns in Randomness: A Computational Model of Surprise

Abstract: Although surprise seems to be a ubiquitous cognitive process, its precise definition and function remain elusive. Surprise is often conceptualized as being related to improbability, or to contrasts with higher-probability expectations. In contrast to this probabilistic view, we argue that surprising observations are those that undermine an existing model, implying an alternative causal origin. Surprises are not merely improbable events; instead, they indicate a breakdown in the model being used to quantify probability…

Cited by 13 publications (19 citation statements)
References 32 publications (62 reference statements)
“…As illustrated by several papers in this issue, these three approaches need not be mutually exclusive, and can often be complementary. For example, Macedo and Cardoso (2019) propose a Bayesian model, and Maguire, Moser, Maguire, and Keane (2019) propose a potentially complementary information-theoretic account that can deal with surprises that arise when one has no specific expectations (hence no Bayesian prior). In addition, Sim and Xu (2019) discuss both Bayesian (e.g., Téglás et al., 2011) and information-theoretic (e.g., Kidd et al., 2014) approaches that have informed work on infants' reasoning.…”
Section: How Does Surprise Look At Different Levels Of Analysis? (mentioning; confidence: 99%)
“…Both Baldi and Itti (2010) and Maguire et al. (2018) have provided converging theories of surprise which model how people learn from subjectively uncertain outcomes, one based on Bayesian probability and the other on algorithmic information theory.…”
Section: Surprise (mentioning; confidence: 99%)
“…One issue with Baldi and Itti's (2010) formulation of surprise is that computing it requires identifying a set of relevant hypotheses. Maguire et al. (2018) provide a more generalized theory of surprise based on algorithmic information theory (AIT; see Li and Vitányi, 2008). Whereas Baldi and Itti's (2010) model expresses the informativeness of an observation relative to a pre-defined set of competing hypotheses, Maguire et al.'s model expresses subjective informativeness in terms of the universal likelihood measure of randomness deficiency (see also Maguire et al., 2014, 2016).…”
Section: Surprise (mentioning; confidence: 99%)
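The contrast drawn in this citation can be made concrete with a small sketch. Bayesian surprise in the style of Baldi and Itti scores the divergence between a prior and a posterior over an explicit set of hypotheses, whereas an AIT-style account scores how far an observation departs from typical, incompressible random data, with no hypothesis set required. The Python sketch below is purely illustrative: the coin-flip example, the three-hypothesis grid, and the use of zlib compression as a rough stand-in for the uncomputable Kolmogorov complexity are assumptions of this sketch, not the formulations used by Baldi and Itti (2010) or Maguire et al. (2018).

```python
import math
import zlib

def bayesian_surprise(prior, likelihoods):
    """KL divergence from prior to posterior over an explicit hypothesis set
    (the general shape of a Bayesian surprise measure; the discrete grid is an
    illustrative simplification)."""
    evidence = sum(p * l for p, l in zip(prior, likelihoods))
    posterior = [p * l / evidence for p, l in zip(prior, likelihoods)]
    return sum(q * math.log2(q / p) for p, q in zip(posterior, prior) if q > 0)

def compression_deficiency(sequence: str) -> int:
    """Rough proxy for randomness deficiency: bytes saved by a general-purpose
    compressor relative to a literal encoding. zlib stands in for the
    uncomputable Kolmogorov complexity of the AIT account."""
    raw = sequence.encode()
    return max(0, len(raw) - len(zlib.compress(raw, 9)))

# Example: observing 20 heads in a row.
# The Bayesian measure needs explicit hypotheses about the coin's bias ...
prior = [1 / 3, 1 / 3, 1 / 3]                 # fair, heads-biased, tails-biased
likelihoods = [0.5**20, 0.9**20, 0.1**20]     # P(20 heads | each hypothesis)
print(bayesian_surprise(prior, likelihoods))  # large: posterior shifts sharply

# ... whereas the compression-based score flags the run as highly patterned
# without naming any hypotheses: the all-heads string compresses well, while a
# mixed sequence of the same length typically shows little or no saving.
print(compression_deficiency("H" * 20))
print(compression_deficiency("HTTHHTHTTHHTHTHHTTHT"))
```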