2016
DOI: 10.3758/s13428-016-0746-9

Bayesian inference with Stan: A tutorial on adding custom distributions

Abstract: When evaluating cognitive models based on fits to observed data (or, really, any model that has free parameters), parameter estimation is critically important. Traditional techniques like hill climbing by minimizing or maximizing a fit statistic often result in point estimates. Bayesian approaches instead estimate parameters as posterior probability distributions, and thus naturally account for the uncertainty associated with parameter estimation; Bayesian approaches also offer powerful and principled methods …

Cited by 75 publications (82 citation statements)
References 27 publications
“…As such, interest in the Bayesian approach to evaluating cognitive models has grown significantly over the past few years. Some applications include evaluating variants of signal detection theory (e.g., Refs ), multinomial processing trees (e.g., Ref ), individual differences (e.g., Refs ), decision making (e.g., Refs ), multidimensional scaling (e.g., Refs ), choice response time (e.g., Refs ), memory (e.g., Refs ), and joint modeling of neural and behavioral data (e.g., Refs ).…”
Section: Introducing the Bayesian Statistical Approach (mentioning)
confidence: 99%
“…For many, if not most, cognitive models, it is necessary to implement custom probability distributions. WinBUGS and JAGS allow this, but require relatively low‐level programming in C++; Stan allows this within the same Stan programming language directly …”
Section: Introducing the Bayesian Statistical Approach (mentioning)
confidence: 99%
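
To make the quoted point concrete, here is a minimal sketch of a user-defined distribution written entirely in the Stan language. The shifted-exponential density, the name shifted_exp_lpdf, and the priors are illustrative assumptions for this report, not an example taken from the tutorial itself.

functions {
  // Hypothetical custom density: a shifted exponential with shift psi
  // and rate lambda; the _lpdf suffix lets it be used in sampling statements.
  real shifted_exp_lpdf(real y, real psi, real lambda) {
    if (y < psi)
      return negative_infinity();
    return log(lambda) - lambda * (y - psi);
  }
}
data {
  int<lower=1> N;
  vector[N] y;
}
parameters {
  real<lower=0> lambda;
  real<lower=0, upper=min(y)> psi;  // keeps y - psi nonnegative for every observation
}
model {
  lambda ~ gamma(2, 1);             // weakly informative priors (assumed)
  psi ~ normal(0, 1);
  for (n in 1:N)
    y[n] ~ shifted_exp(psi, lambda);  // the _lpdf suffix is dropped in ~ statements
}

The same increment could be written explicitly as target += shifted_exp_lpdf(y[n] | psi, lambda); either way, no C++ is required.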
“…To use a distribution in a differentiable model, the distribution's log probability density function must be differentiable by both the distribution's parameters and by the value of the random variable. Distributions are usually supplied as a part of the probabilistic programming framework [10,11,51], and, if at all possible, adding a user-defined distribution requires programming at a lower level of abstraction than while specifying a probabilistic model [1,41,51].…”
Section: Model Composition (mentioning)
confidence: 99%
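
As a worked illustration of this requirement, consider the hypothetical shifted exponential sketched above (not a distribution discussed in the paper). Its log density and gradients, in LaTeX notation:

\log f(y \mid \psi, \lambda) = \log \lambda - \lambda (y - \psi), \qquad y \ge \psi,

\frac{\partial \log f}{\partial \lambda} = \frac{1}{\lambda} - (y - \psi), \qquad
\frac{\partial \log f}{\partial \psi} = \lambda, \qquad
\frac{\partial \log f}{\partial y} = -\lambda.

All three derivatives exist on the support, so automatic differentiation can supply the gradients that Hamiltonian Monte Carlo needs with respect to the parameters and, when required, the variate itself.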
“…Consider first a Bayesian hierarchical version of the diffusion model (e.g., Vandekerckhove, Tuerlinckx, & Lee, 2011) or of the linear ballistic accumulator model (e.g., Annis, Miller, & Palmeri, in press) of perceptual decision making. Recall that these models explain variability in response times and response probabilities via a combination of drift rate, response threshold, starting point, non-decision time (for debates regarding variability in these parameters, see Heathcote, Wagenmakers, & Scott, 2014; Jones & Dzhafarov, 2014; Smith, Ratcliff, & McKoon, 2014).…”
Section: Modeling Individual Differences In Categorization (mentioning)
confidence: 99%
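
For readers unfamiliar with these models, the following is a minimal sketch of what a hierarchical diffusion-model specification can look like in Stan, using Stan's built-in Wiener first-passage-time density. The variable names, priors, and the restriction to upper-boundary response times are simplifying assumptions for illustration, not the specification used in the works cited above.

data {
  int<lower=1> N;                        // trials
  int<lower=1> J;                        // participants
  array[N] int<lower=1, upper=J> subj;   // participant index per trial
  vector<lower=0>[N] rt;                 // response times (s), upper boundary only
}
parameters {
  real mu_delta;                         // group-level mean drift rate
  real<lower=0> sigma_delta;
  real<lower=0> mu_alpha;                // group-level mean boundary separation
  real<lower=0> sigma_alpha;
  vector[J] delta;                       // individual drift rates
  vector<lower=0>[J] alpha;              // individual boundary separations
  vector<lower=0, upper=min(rt)>[J] tau; // individual non-decision times
  vector<lower=0, upper=1>[J] bias;      // individual starting points
}
model {
  mu_delta ~ normal(0, 2);               // assumed weakly informative priors
  sigma_delta ~ normal(0, 1);
  mu_alpha ~ normal(1, 1);
  sigma_alpha ~ normal(0, 1);
  delta ~ normal(mu_delta, sigma_delta);
  alpha ~ normal(mu_alpha, sigma_alpha);
  bias ~ beta(2, 2);
  // tau gets an implicit uniform prior over its constrained range
  for (n in 1:N)
    rt[n] ~ wiener(alpha[subj[n]], tau[subj[n]], bias[subj[n]], delta[subj[n]]);
}

Lower-boundary responses are typically handled by reparameterizing drift and bias, and a linear ballistic accumulator version would need a user-defined density of the kind sketched earlier, since Stan has no built-in LBA distribution.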