2023
DOI: 10.1038/s41598-023-40278-3
Neural superstatistics for Bayesian estimation of dynamic cognitive models

Lukas Schumacher, Paul-Christian Bürkner, Andreas Voss, et al.

Abstract: Mathematical models of cognition are often memoryless and ignore potential fluctuations of their parameters. However, human cognition is inherently dynamic. Thus, we propose to augment mechanistic cognitive models with a temporal dimension and estimate the resulting dynamics from a superstatistics perspective. Such a model entails a hierarchy between a low-level observation model and a high-level transition model. The observation model describes the local behavior of a system, and the transition model specifie…
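The hierarchy sketched in the abstract — a high-level transition model that lets parameters drift over time, and a low-level observation model that generates data given the current parameter value — can be illustrated with a toy simulation. The following is a minimal sketch under assumed distributions (a Gaussian random walk on a log-rate parameter driving exponential response times); the function name, parameters, and distributional choices are illustrative and not the specification used in the paper.

```python
import numpy as np

def simulate_superstatistical_model(T=200, sigma_high=0.05, seed=1):
    """Toy superstatistical model (illustrative, not the paper's spec).

    High-level transition model: Gaussian random walk on a log-rate.
    Low-level observation model: exponential response times whose
    rate is set by the slowly drifting high-level parameter.
    """
    rng = np.random.default_rng(seed)
    log_rate = np.zeros(T)
    rts = np.zeros(T)
    for t in range(T):
        if t > 0:
            # transition model: the parameter drifts slowly across trials
            log_rate[t] = log_rate[t - 1] + sigma_high * rng.normal()
        # observation model: local behavior given the current parameter
        rts[t] = rng.exponential(scale=np.exp(-log_rate[t]))
    return log_rate, rts

log_rate, rts = simulate_superstatistical_model()
print(rts.shape)  # (200,)
```

In the superstatistics framing, inference then targets both the observation-model parameters and the trajectory implied by the transition model, rather than a single static parameter vector.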

Cited by 5 publications (4 citation statements) · References 71 publications
“…Finally, it should be noted that the version of our method explored here can only compare HMs assuming exchangeable data at each hierarchical level. Although the majority of HMs in social science research follow this probabilistic symmetry, some researchers may want to compare nonexchangeable HMs, for example, to study within-person dynamics (Driver & Voelkle, 2018; Lodewyckx et al., 2011; Schumacher et al., 2022). Fortunately, the modularity of our method allows easy adaptation of the neural network architecture to handle nonexchangeable HMs.…”
Section: Limitations and Outlook (mentioning)
confidence: 99%
“…BayesFlow has been used for amortized Bayesian inference in various areas of applied research, such as epidemiology (Radev et al., 2021), cognitive modeling (Krause et al., 2022; Schumacher et al., 2023; Sokratous et al., 2023; Wieschen et al., 2020), computational psychiatry (D'Alessandro et al., 2020), neuroscience (Ghaderi-Kangavari et al., 2022), particle physics (Bieringer et al., 2021), agent-based econometrics models (Shiono, 2021), seismic imaging (Siahkoohi et al., 2023), user behavior (Moon et al., 2023), structural health monitoring (Zeng et al., 2023), aerospace (Tsilifis et al., 2022) and wind turbine design (Noever-Castelos et al., 2022), micro-electro-mechanical systems testing (Heringhaus et al., 2022), and fractional Brownian motion (Verdier et al., 2022).…”
Section: Model Misspecification Detection (mentioning)
confidence: 99%
“…ative models (Radev et al., 2021; Fang et al., 2022). Extensions of this approach are beginning to allow for hierarchical model comparison (Elsemüller et al., 2023), fitting models with time-varying parameters (Schumacher et al., 2023), fitting data with missing values using (variational) autoencoders (McCoy et al., 2018), and fitting joint distributions of multiple types of data (Kvam et al., 2022).…”
Section: Neural Network (mentioning)
confidence: 99%
“…New theories that are unconstrained by likelihoods will allow for new and better explanations, predictions, and questions about the cognitive processes that support decision-making (McMullin, 2013). For example, likelihood-free methods can be used to better implement standard models like the leaky competing accumulator model (Miletić et al., 2017), extend existing models with more realistic assumptions like time-varying parameters (Schumacher et al., 2023), or even create and test entirely new theories like dynamic models of pricing behavior (Kvam & Busemeyer, 2020). The value of amortized inference is not only in improved efficiency of estimation and model comparison, but also in expanding the scope of models we can consider when theorizing.…”
Section: Neural Network (mentioning)
confidence: 99%