Computational Models of Brain and Behavior 2017
DOI: 10.1002/9781119159193.ch33
Complex Probabilistic Inference

Cited by 28 publications (29 citation statements)
References 58 publications
“…In fact, the intermediate sensory representation is itself often assumed to be the result of an inference process over latent variables in an internal model of the world (Mumford, 1992; Lee and Mumford, 2003; Yuille and Kersten, 2006). This process is naturally formalized as hierarchical inference (Figure 1b) in which feedforward connections communicate the likelihood and feedback communicates the prior or other contextual expectations (Fiser et al., 2010; Pouget et al., 2013; Gershman and Beck, 2016; Tajima et al., 2017; Lange and Haefner, 2020).…”
Section: Results
confidence: 99%
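As a concrete illustration of the scheme described in the excerpt above, the following is a minimal sketch of combining a feedforward likelihood with a feedback prior to form a posterior over a single discrete latent variable. The state names and all numbers are illustrative assumptions, not values from any of the cited papers.

```python
import numpy as np

# Toy latent variable with two states (names are hypothetical).
states = ["vertical", "horizontal"]

# Feedforward signal: likelihood of the sensory evidence under each state.
likelihood = np.array([0.7, 0.2])

# Feedback signal: prior / contextual expectation from a higher area.
prior = np.array([0.3, 0.7])

# Bayes' rule: posterior is proportional to likelihood times prior.
unnormalized = likelihood * prior
posterior = unnormalized / unnormalized.sum()

for s, p in zip(states, posterior):
    print(f"P({s} | evidence, context) = {p:.3f}")
```

In this toy case the contextual prior pulls the posterior toward the second state even though the evidence favors the first, which is the qualitative effect the excerpt attributes to feedback connections.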
“…However, it is worth noting that these properties follow from the form of the underlying generative model. The challenge is to identify the appropriate generative model that best explains the generative process (or the empirical responses) of interest (Gershman & Beck, 2017). In the Frozen-Lake simulation, by equipping the agents with beliefs about the current context, we were able (via the generative model and its belief updating process) to convert a learning problem into a planning-as-inference problem.…”
Section: Discussion
confidence: 99%
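A minimal sketch of the kind of context-belief updating the excerpt refers to, assuming a small discrete set of contexts and a known likelihood for each observation under each context. The context labels and probabilities are hypothetical and are not taken from the cited Frozen-Lake generative model.

```python
import numpy as np

# Hypothetical contexts and per-observation likelihoods (illustrative only).
contexts = ["slippery", "dry"]
obs_likelihoods = np.array([
    [0.8, 0.3],   # P(observation 1 | context)
    [0.7, 0.4],   # P(observation 2 | context)
    [0.6, 0.5],   # P(observation 3 | context)
])

belief = np.array([0.5, 0.5])  # prior belief over contexts

# Sequential Bayesian updating: each posterior becomes the next prior.
for lik in obs_likelihoods:
    belief = belief * lik
    belief /= belief.sum()

print(dict(zip(contexts, np.round(belief, 3))))
```

Once the agent maintains such a belief over contexts, action selection can condition on that belief rather than on a single assumed context, which is one way a learning problem can be recast as inference under a generative model.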
“…This means that if we are going to consider expressive generative models, we will need to also consider approximate inference. Historically, approximate inference algorithms have fallen into two families (Gershman and Beck, 2017). One family, Monte Carlo algorithms, approximates the posterior via stochastic simulation.…”
Section: Generative Models: Explicit and Implicit
confidence: 99%
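As a minimal sketch of the Monte Carlo family mentioned in the excerpt, the following uses self-normalized importance sampling with the prior as the proposal distribution. The Gaussian model and the observed value are illustrative assumptions, not an example taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumed): theta ~ Normal(0, 1), x | theta ~ Normal(theta, 1).
# We observe x = 1.2 and estimate the posterior mean of theta.
x_obs = 1.2
n_samples = 100_000

theta = rng.normal(0.0, 1.0, size=n_samples)   # draws from the prior (proposal)
log_w = -0.5 * (x_obs - theta) ** 2             # log-likelihood up to a constant
w = np.exp(log_w - log_w.max())                 # stabilize before normalizing
w /= w.sum()

posterior_mean = np.sum(w * theta)
print(f"Monte Carlo posterior mean = {posterior_mean:.3f} (exact: {x_obs / 2:.3f})")
```

With a conjugate Gaussian prior and likelihood the exact posterior mean is x/2, so the stochastic estimate can be checked directly; more expressive generative models lack such closed forms, which is why sampling-based approximation is needed at all.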