We use computational modeling to examine the ability of evidence accumulation models to produce the reaction time (RT) distributions and attentional biases found in behavioral and eye-tracking research. We focus on simulating RTs and attention in binary choice, with particular emphasis on whether different models can predict the late onset bias (LOB) commonly found in eye movements during choice (sometimes called the gaze cascade). The first finding is that this bias is predicted by models even when attention is entirely random and independent of the choice process. This shows that the LOB is not evidence of a feedback loop between evidence accumulation and attention. Second, we examine models with a relative evidence decision rule and an absolute evidence rule. In the relative models, a decision is made once the difference in evidence accumulated for two items reaches a threshold. In the absolute models, a decision is made once one item accumulates a certain amount of evidence, independently of how much is accumulated for a competitor. Our core result is simple: the existence of the late onset gaze bias to the option ultimately chosen, together with a positively skewed RT distribution, means that the stopping rule must be relative, not absolute. A large-scale grid search of parameter space shows that absolute threshold models struggle to predict these phenomena even when incorporating evidence decay and assumptions of either mutual inhibition or feedforward inhibition.
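The relative and absolute stopping rules contrasted above can be illustrated with a minimal two-accumulator simulation. This is a sketch, not the paper's implementation: the drift rates, noise level, threshold, and decay value below are illustrative assumptions, not estimates from the study.

```python
import random

def simulate_trial(rule, drift=(0.12, 0.10), noise=0.5, threshold=3.0,
                   decay=0.0, max_steps=5000, rng=random):
    """Simulate one binary-choice trial under a 'relative' or 'absolute'
    stopping rule. All parameter values are illustrative only."""
    evidence = [0.0, 0.0]
    for t in range(1, max_steps + 1):
        for i in range(2):
            # Noisy evidence accumulation with optional leak/decay.
            evidence[i] += drift[i] + rng.gauss(0, noise) - decay * evidence[i]
        if rule == "relative":
            # Relative rule: stop when the *difference* reaches threshold.
            if abs(evidence[0] - evidence[1]) >= threshold:
                return (0 if evidence[0] > evidence[1] else 1), t
        else:
            # Absolute rule: stop when either accumulator alone reaches it.
            for i in range(2):
                if evidence[i] >= threshold:
                    return i, t
    return None, max_steps  # no decision within the step budget

rng = random.Random(1)
relative_rts = [simulate_trial("relative", rng=rng)[1] for _ in range(500)]
```

Collecting decision times across many simulated trials, as in the last line, is the kind of exercise that yields the RT distributions the models are evaluated against.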
Decision makers are often unable to choose between the options that they are offered. In these settings they typically defer their decision, that is, delay the decision to a later point in time or avoid the decision altogether. In this paper, we outline eight behavioral findings regarding the causes and consequences of choice deferral that cognitive theories of decision making should be able to capture. We show that these findings can be accounted for by a deferral-based time limit applied to existing sequential sampling models of preferential choice. Our approach to modeling deferral as a time limit in a sequential sampling model also makes a number of novel predictions regarding the interactions between choice probabilities, deferral probabilities, and decision times, and we confirm these predictions in an experiment. Choice deferral is a key feature of everyday decision making, and our paper illustrates how established theoretical approaches can be used to understand the cognitive underpinnings of this important behavioral phenomenon.
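The core idea of a deferral-based time limit in a sequential sampling model can be sketched as follows: evidence is sampled until either the difference reaches a decision threshold or a deadline passes, in which case the decision is deferred. The parameter values are hypothetical, chosen only to make all three outcomes (choose A, choose B, defer) occur.

```python
import random

def choose_or_defer(drift=(0.05, 0.04), noise=0.4, threshold=2.0,
                    time_limit=60, rng=random):
    """One trial of a relative-evidence sampler with a deferral time limit.
    If the evidence difference has not reached threshold by the deadline,
    the decision is deferred. All parameter values are illustrative."""
    diff = 0.0
    for t in range(1, time_limit + 1):
        # Accumulate the noisy difference in evidence between the options.
        diff += (drift[0] - drift[1]) + rng.gauss(0, noise)
        if abs(diff) >= threshold:
            return ("A" if diff > 0 else "B"), t
    return "defer", time_limit
```

Because deferrals are exactly the trials in which evidence fails to reach threshold in time, this construction links choice probabilities, deferral probabilities, and decision times within a single mechanism, which is what generates the kind of interaction predictions described above.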
In a paper published in Management Science in 2015, Stewart, Reimers, and Harris (SRH) demonstrated that the shapes of utility and probability weighting functions could be manipulated by adjusting the distributions of outcomes and probabilities on offer, as predicted by the theory of decision by sampling. So marked were these effects that, at face value, they profoundly challenge standard interpretations of preference theoretic models in which such functions are supposed to reflect stable properties of individual risk preferences. Motivated by this challenge, we report an extensive replication exercise based on a series of experiments conducted as a quasi-adversarial collaboration across different labs and involving researchers from both economics and psychology. We replicate the SRH effect across multiple experiments involving changes in many design features; importantly, however, we find that the effect is also present in designs modified so that decision by sampling predicts no effect. Although those results depend on model-based inferences, an alternative analysis using a model-free comparison approach finds no evidence of patterns akin to the SRH effect. On the basis of simulation exercises, we demonstrate that the SRH effect may be a consequence of misspecification biases arising in parameter recovery exercises that fit imperfectly specified choice models to experimental data. Overall, our analysis casts the SRH effect in an entirely new light.
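The logic of a parameter recovery exercise, and how misspecification can distort it, can be sketched in a few lines: simulate risky choices from a known power-utility logit model, then refit the curvature parameter by grid-search maximum likelihood, once with the choice sensitivity correctly specified and once with it deliberately fixed at the wrong value. Every stimulus range and parameter value here is hypothetical and has no connection to the SRH experiments.

```python
import math
import random

def power_u(x, rho):
    """Power utility u(x) = x**rho (illustrative functional form)."""
    return x ** rho

def p_choose_gamble(gamble, sure, rho, beta):
    """Logit probability of choosing the gamble (x with prob p) over a
    sure amount, given curvature rho and choice sensitivity beta."""
    x, p = gamble
    eu_diff = p * power_u(x, rho) - power_u(sure, rho)
    return 1.0 / (1.0 + math.exp(-beta * eu_diff))

def simulate_choices(n_trials, rho, beta, rng):
    """Generate synthetic choices from the true model (hypothetical stimuli)."""
    data = []
    for _ in range(n_trials):
        x = rng.uniform(5.0, 50.0)
        p = rng.uniform(0.2, 0.9)
        sure = rng.uniform(0.3, 0.9) * x * p  # sure amount near the gamble's EV
        prob_g = p_choose_gamble((x, p), sure, rho, beta)
        data.append(((x, p), sure, rng.random() < prob_g))
    return data

def fit_rho(data, beta, grid):
    """Grid-search MLE for rho with beta held fixed. Fixing beta at the
    wrong value makes the fitted model misspecified."""
    best_rho, best_ll = None, -math.inf
    for rho in grid:
        ll = 0.0
        for gamble, sure, chose_g in data:
            pg = min(max(p_choose_gamble(gamble, sure, rho, beta), 1e-9), 1 - 1e-9)
            ll += math.log(pg if chose_g else 1.0 - pg)
        if ll > best_ll:
            best_rho, best_ll = rho, ll
    return best_rho

rng = random.Random(0)
TRUE_RHO, TRUE_BETA = 0.7, 1.0
data = simulate_choices(3000, TRUE_RHO, TRUE_BETA, rng)
grid = [0.30 + 0.05 * i for i in range(17)]      # rho candidates 0.30..1.10
rho_well = fit_rho(data, TRUE_BETA, grid)        # well-specified refit
rho_mis = fit_rho(data, 3.0 * TRUE_BETA, grid)   # sensitivity misspecified
```

When the fitted model matches the data-generating model, the recovered curvature lands near the true value; with the sensitivity parameter pinned at the wrong value, the recovered curvature will typically shift to compensate, which is the flavor of misspecification bias the abstract describes (the direction and size of any shift depend on the stimuli).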