2019
DOI: 10.2139/ssrn.3496098
Estimating Parameters of Structural Models Using Neural Networks

Cited by 8 publications (10 citation statements)
References 43 publications
“…For models with tractable likelihood, we performed the same model identification process using AIC (Akaike, 1998) that relies on likelihood computation, penalized by number of parameters, to quantify model fitness as a benchmark. We note that another common criterion, BIC (Wei and Jiang, 2022), performed more poorly than AIC in our case. The best fitting model is identified based on the lowest AIC score - a successful model recovery would indicate that the true model has the lowest AIC score compared to other models fit to that data. To construct the confusion matrix, we computed best AIC score proportions for all models, across all agents, for data sets simulated from each cognitive model (Fig. 5; see methods).…”
Section: Results
Mentioning confidence: 60%
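The model-recovery procedure the quoted passage describes (fit every candidate model to data simulated from each model, pick the lowest AIC per agent, then tabulate best-AIC proportions into a confusion matrix) can be sketched as follows. The log-likelihoods, model count, and parameter counts below are synthetic placeholders, not values from the cited study:

```python
import numpy as np

def aic(log_likelihood, n_params):
    # Akaike information criterion: AIC = 2k - 2*ln(L); lower is better
    return 2 * n_params - 2 * log_likelihood

rng = np.random.default_rng(0)
n_models, n_agents = 3, 4
n_params = np.array([2, 3, 4])  # free parameters per candidate model

# synthetic fitted log-likelihoods, shape (true model, agent, fitted model)
log_liks = rng.normal(-100.0, 5.0, size=(n_models, n_agents, n_models))

scores = aic(log_liks, n_params)   # n_params broadcasts over the last axis
best = scores.argmin(axis=-1)      # best-AIC model index per (true model, agent)

# confusion matrix: row = simulating model, column = proportion of that row's
# agents for which each candidate model achieved the lowest AIC
confusion = np.stack([(best == m).mean(axis=1) for m in range(n_models)],
                     axis=1)
print(confusion)
```

A successful recovery would show up as large values on the diagonal of `confusion`; each row sums to 1 by construction.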
“…By specifying model equations, researchers can inject different theoretical assumptions into models, and, for most models, simulate synthetic data to make predictions and compare against observed behavior. Researchers can quantitatively arbitrate between different theories by comparing goodness of fit (Akaike, 1998; Wei and Jiang, 2022) across different models. Furthermore, by estimating model parameters that quantify underlying cognitive processes, researchers have been able to characterize important individual differences (e.g.…”
Section: Figure
Mentioning confidence: 99%
“…Finally, it is worth mentioning that other recent work has considered the combination of deep learning and some form of structural modeling (examples include Wei and Jiang, 2019; Igami, 2020; Kaji et al, 2020; Chen et al, 2021). Typically, the goal is estimation of a parametric structural model and deep learning methods are applied to learn the mapping of data to parameters.…”
Section: Structured Deep Learning For Parameter Functions
Mentioning confidence: 99%
“…Finally, it is worth mentioning that other recent work has considered the combination of deep learning and structural modeling. Typically, the goal is estimation of a parametric structural model and deep learning methods are applied to learn the mapping of data to parameters (Wei and Jiang, 2019; Kaji et al, 2020; Igami, 2020). Our focus, using deep learning to estimate individual-level heterogeneity, is quite different, and further, we give theoretical results on deep neural network estimation and subsequent inference which are not available in prior work.…”
Section: Deep Thoughts
Mentioning confidence: 99%
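The idea the last two citing passages attribute to the paper — learning the mapping from simulated data to structural parameters — can be illustrated with a deliberately minimal stand-in. Everything here is an assumption for illustration: a least-squares linear map replaces the neural network, and the toy exponential "structural model", the prior range, and all variable names are invented, not the cited paper's setup:

```python
import numpy as np

# 1. draw parameters from a prior, 2. simulate data given each draw,
# 3. fit a regressor from data summaries back to the parameter,
# 4. apply the fitted map to "observed" data.
rng = np.random.default_rng(1)

def simulate(theta, n_obs=50):
    # toy structural model: exponential observations with rate theta
    return rng.exponential(1.0 / theta, size=n_obs)

thetas = rng.uniform(0.5, 5.0, size=2000)           # prior draws
summaries = np.array([[x.mean(), x.std()]
                      for x in (simulate(t) for t in thetas)])

# regress the model's scale (1/theta) on the summaries; for this toy model
# the scale equals the population mean, so a linear map is adequate
scales = 1.0 / thetas
X = np.column_stack([summaries, np.ones(len(thetas))])
w, *_ = np.linalg.lstsq(X, scales, rcond=None)

# sanity check on "observed" data generated from a known parameter
true_theta = 2.0
obs = simulate(true_theta, n_obs=5000)
est_theta = 1.0 / (np.array([obs.mean(), obs.std(), 1.0]) @ w)
print(est_theta)
```

The linear map is only a placeholder for the flexible function approximator; the structural point survives the simplification: because simulation is cheap, the estimator can be trained entirely on synthetic (parameter, data) pairs.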