Preprint, 2023
DOI: 10.1101/2023.09.14.557793

Artificial neural networks for model identification and parameter estimation in computational cognitive models

Milena Rmus, Ti-Fen Pan, Liyu Xia, et al.

Abstract: Computational cognitive models have been used extensively to formalize cognitive processes. Model parameters offer a simple way to quantify individual differences in how humans process information. Similarly, model comparison allows researchers to identify which theories, embedded in different models, provide the best accounts of the data. Cognitive modeling uses statistical tools to quantitatively relate models to data; these tools often rely on computing or estimating the likelihood of the data under the model. However…
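To make the likelihood computation the abstract refers to concrete, here is a minimal Python sketch, not taken from the paper: the model (two-armed-bandit Q-learning with a softmax policy) and all names (q_learning_loglik, alpha, beta) are illustrative assumptions.

```python
import numpy as np

def q_learning_loglik(choices, rewards, alpha, beta, n_actions=2):
    """Log-likelihood of a choice sequence under Q-learning with a
    softmax policy (illustrative model, not the paper's).

    choices -- int action index per trial
    rewards -- float outcome per trial
    alpha   -- learning rate in [0, 1]
    beta    -- softmax inverse temperature (> 0)
    """
    q = np.zeros(n_actions)                  # initial action values
    loglik = 0.0
    for c, r in zip(choices, rewards):
        logits = beta * q
        logits = logits - logits.max()       # numerical stability
        loglik += logits[c] - np.log(np.exp(logits).sum())
        q[c] += alpha * (r - q[c])           # prediction-error update
    return loglik
```

Classical tools maximize this quantity (maximum-likelihood estimation) or integrate over it (Bayesian inference); as the citation statements below note, for some models (e.g., the cited "options model") this likelihood is intractable, which is the gap the preprint's network-based approach targets.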

Cited by 2 publications (3 citation statements). References: 90 publications.
“…Following the trends of artificial neural networks, deep learning [247][248][249][250][251], and deep RL [252][253][254][255][256][257][258][259][260], recent approaches to cognitive modeling have begun to utilize machine learning via the architecture of a recurrent neural network (RNN) [261][262][263], such as with a long short-term memory (LSTM) unit [264] or a simpler gated recurrent unit (GRU) [265], in an attempt to understand core computations for learning (i.e., beyond just nonlinear function approximation for state representation) [266][267][268][269][270][271][272][273][274][275][276][277][278][279]. Whereas such efforts pursue a data-centric approach leveraging predictive power as opposed to the present theory-centric approach leveraging explanatory power, it is the latter that has so far proven more effective for inference about empirical behavior (but see [87,280]).…”
Section: Dynamics of Hysteresis
confidence: 99%
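As one illustration of the data-centric RNN approach this statement describes, here is a minimal PyTorch sketch; it is a generic GRU over one-hot choice/reward history predicting the next choice, an assumption of mine rather than any cited work's architecture.

```python
import torch.nn as nn

class ChoiceGRU(nn.Module):
    """Maps a history of (previous choice, previous reward) pairs to
    next-choice logits, trial by trial (illustrative architecture)."""
    def __init__(self, n_actions=2, hidden_size=32):
        super().__init__()
        # per-trial input: one-hot previous choice + scalar previous reward
        self.rnn = nn.GRU(n_actions + 1, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_actions)

    def forward(self, x):
        # x: (batch, n_trials, n_actions + 1)
        h, _ = self.rnn(x)          # h: (batch, n_trials, hidden_size)
        return self.head(h)         # logits: (batch, n_trials, n_actions)
```

Training would minimize cross-entropy between these logits and the observed choices; the quoted contrast is that such a network is optimized for prediction, whereas theory-centric models commit to interpretable update rules like the prediction-error update in the earlier sketch.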
“…The options model was unfittable through HBI methods due to intractable likelihood [101]. To approximate the learning parameters for each participant (β_learn, α_learn, and ϕ), we fit a simpler reinforcement learning model (without options), this time through maximum-likelihood methods and only on action selection data, and used the resulting best-fit learning parameters for simulations.…”
Section: E. Parameter and Model Recoverability
confidence: 99%
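A minimal sketch of the per-participant maximum-likelihood fallback this statement describes follows; scipy's L-BFGS-B is my choice of optimizer, the toy data are synthetic, and treating ϕ as a forgetting rate is a hypothetical reading, not the cited paper's definition.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, choices, rewards, n_actions=2):
    """Negative log-likelihood of Q-learning with softmax action selection.

    params = (beta_learn, alpha_learn, phi); phi as a forgetting rate
    on action values is an illustrative assumption.
    """
    beta, alpha, phi = params
    q = np.zeros(n_actions)
    nll = 0.0
    for c, r in zip(choices, rewards):
        logits = beta * q
        logits = logits - logits.max()            # numerical stability
        nll -= logits[c] - np.log(np.exp(logits).sum())
        q *= 1.0 - phi                            # forget old values
        q[c] += alpha * (r - q[c])                # update chosen value
    return nll

# Toy data standing in for one participant's action-selection record:
rng = np.random.default_rng(0)
choices = rng.integers(0, 2, size=200)
rewards = rng.random(size=200)

res = minimize(neg_loglik, x0=[1.0, 0.5, 0.1], args=(choices, rewards),
               bounds=[(1e-3, 20.0), (0.0, 1.0), (0.0, 1.0)],
               method="L-BFGS-B")
beta_hat, alpha_hat, phi_hat = res.x
```

This works only because the simplified model's trial-by-trial likelihood is computable; for the options model it is not, which is why the citing authors point to the preprint's method [101].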
“…These results suggest that at least some participants generated insights for learning by recognizing hierarchical relationships in the learning environment, while not necessarily using this information directly to decide which goals to select. Since the options model was not fittable through HBI methods, quantitative confirmation of these results awaits formal model comparison through more advanced model-fitting methods [101].…”
Section: G. The Role of Hierarchy in Learning and Goal Selection
confidence: 99%