2021
DOI: 10.7554/elife.65074
Likelihood approximation networks (LANs) for fast inference of simulation models in cognitive neuroscience

Abstract: In cognitive neuroscience, computational modeling can formally adjudicate between theories and affords quantitative fits to behavioral/brain data. Pragmatically, however, the space of plausible generative models considered is dramatically limited by the set of models with known likelihood functions. For many models, the lack of a closed-form likelihood typically impedes Bayesian inference methods. As a result, standard models are evaluated for convenience, even when other models might be superior. Likelihood-f…


Cited by 49 publications (123 citation statements)
References 84 publications
“…Our model also provides process-level insight into how attractor networks can account for the dynamics of attentional control, parsimoniously explaining both within-trial gain dynamics and, critically, how adaptation alters the initial conditions of these feature gains. Future research should directly fit this model to participants’ behavior, potentially leveraging recent advances in the efficient estimation of parameters for complex decision models (Fengler, Govindarajan, Chen, & Frank, 2021), allowing for finer grained comparisons between different accounts of dynamic attentional control.…”
Section: Discussion
confidence: 99%
“…This version of the DDM is the ideal candidate for comparing the performance of different inference methods because the likelihood of an observation given the parameters, L(x | θ), can be calculated analytically (Navarro and Fuss, 2009), in contrast to more complicated versions of the DDM (e.g., Ratcliff and Rouder, 1998; Usher and McClelland, 2001; Reynolds and Rhodes, 2009). We evaluated MNLE’s performance, and compared against the publicly available pre-trained LAN neural networks (Fengler et al., 2021), based on the analytical likelihoods and the inferred posteriors of the DDM.…”
Section: Results
confidence: 99%
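The excerpt above compares inference methods on the simple DDM, whose trial-level likelihood is known analytically. As background, a minimal DDM trial simulator can be sketched as follows; the parameter names (v, a, z, t0) and the Euler-Maruyama stepping scheme are illustrative assumptions, not the cited authors' code:

```python
import numpy as np

def simulate_ddm(v, a, z=0.5, t0=0.3, dt=1e-3, sigma=1.0, max_t=10.0, rng=None):
    """Simulate one trial of a simple drift-diffusion model (sketch).

    v: drift rate, a: boundary separation, z: relative starting point,
    t0: non-decision time. Returns (rt, choice), with choice 1 for the
    upper boundary and 0 for the lower boundary.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = z * a                  # evidence starts between 0 and a
    t = 0.0
    while t < max_t:
        # Euler-Maruyama step of the diffusion process
        x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= a:
            return t0 + t, 1
        if x <= 0.0:
            return t0 + t, 0
    # no boundary crossing within max_t: censor at the nearer boundary
    return t0 + max_t, 1 if x > a / 2 else 0
```

Simulators of this kind supply the training data for likelihood-free methods when only the analytical likelihood of this simple variant is available for validation.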
“…More recent approaches from the field of simulation-based inference (SBI; Cranmer et al., 2020) have the potential to overcome these limitations by using machine learning algorithms such as neural networks. Recently, Fengler et al. (2021) presented an SBI algorithm for a specific problem in cognitive neuroscience—inference for drift-diffusion models (DDMs). They introduced a new approach, called likelihood approximation networks (LANs), which uses neural networks to predict log-likelihoods from data and parameters.…”
Section: Introduction
confidence: 99%
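The LAN idea described in the excerpt, a neural network that maps model parameters and an observation to an approximate log-likelihood, can be sketched structurally. This is a minimal, untrained forward pass with assumed layer sizes and tanh activations, not the published architecture:

```python
import numpy as np

def lan_forward(theta, x, weights):
    """LAN-style forward pass (structural sketch): map parameters theta
    and one observation x (e.g. [rt, choice]) to a scalar approximate
    log-likelihood. `weights` is a list of (W, b) pairs; tanh hidden
    layers with a linear output is one plausible choice, assumed here.
    """
    h = np.concatenate([theta, x])
    for W, b in weights[:-1]:
        h = np.tanh(W @ h + b)
    W, b = weights[-1]
    return (W @ h + b).item()

def init_weights(sizes, rng):
    """Small random initialization for consecutive layer sizes."""
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

# Example: 4 DDM-like parameters plus (rt, choice) -> 6 inputs, 1 output.
rng = np.random.default_rng(0)
weights = init_weights([6, 32, 32, 1], rng)
log_lik = lan_forward(np.array([1.0, 1.5, 0.5, 0.3]),
                      np.array([0.8, 1.0]), weights)
```

In the actual method, such a network would be trained on simulated (parameters, data) pairs so that its output approximates the log-likelihood, which can then be plugged into standard MCMC or variational inference.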