There's a difference between someone instantaneously saying "Yes!" when you ask them on a date and a hesitant "...yes." Psychologists and economists have long studied how people infer others' preferences from their choices. However, these models have tended to focus on what people choose, not on how long it takes them to choose. We present a rational model for inferring preferences from response times, using a Drift Diffusion Model to characterize how preferences influence response time and Bayesian inference to invert this relationship. We test our model's predictions on three experimental questions. Matching model predictions, participants inferred that a decision-maker preferred a chosen item more if the decision-maker spent longer deliberating (Experiment 1); participants predicted a decision-maker's choice in a novel comparison by inferring the decision-maker's relative preferences from previous response times and choices (Experiment 2); and participants incorporated information about whether a decision-maker was in a cautious or careless mental state (Experiments 3, 4A, and 4B).
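The modeling pipeline described above can be sketched in miniature: simulate a Drift Diffusion Model in which the drift rate encodes the strength of a relative preference, then invert it with Bayesian inference over candidate drift rates given an observed choice and response time. All specifics below (threshold, noise level, candidate drift rates, the response-time tolerance, the uniform prior) are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

def simulate_ddm(drift, threshold=1.0, noise=1.0, dt=0.005, rng=None):
    """One DDM trial: noisy evidence accumulates at the given drift rate until
    it crosses +threshold (choose the item) or -threshold (choose the other).
    Returns (chose_item, response_time)."""
    rng = np.random.default_rng() if rng is None else rng
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return x > 0, t

rng = np.random.default_rng(0)
# Standard DDM property: a smaller preference difference (lower drift)
# predicts longer deliberation on average.
strong = [simulate_ddm(2.0, rng=rng)[1] for _ in range(200)]
weak = [simulate_ddm(0.3, rng=rng)[1] for _ in range(200)]

# Inverting the model: given one observed choice and response time, score
# candidate preference strengths with a crude Monte Carlo likelihood
# (fraction of simulated trials matching the choice with a similar RT).
observed_choice, observed_rt = True, 1.2   # a slow "yes"
drifts = np.array([0.3, 1.0, 2.0])         # candidate preference strengths

def likelihood(drift, n=400):
    sims = [simulate_ddm(drift, rng=rng) for _ in range(n)]
    return np.mean([c == observed_choice and abs(t - observed_rt) < 0.25
                    for c, t in sims])

posterior = np.array([likelihood(d) for d in drifts])  # uniform prior
posterior /= posterior.sum()
```

A full treatment would use the DDM's analytic first-passage-time distribution rather than Monte Carlo matching, but the sketch shows the two moving parts the abstract names: a forward model from preferences to response times, and Bayesian inversion from an observed response time back to preferences.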
When someone hosts a party, when governments choose an aid program, or when assistive robots decide what meal to serve to a family, decision‐makers must determine how to help even when their recipients have very different preferences. Which combination of people’s desires should a decision‐maker serve? To provide a potential answer, we turned to psychology: What do people think is best when multiple people have different utilities over options? We developed a quantitative model of what people consider desirable behavior, characterizing participants’ preferences by inferring which combination of “metrics” (maximax, maxsum, maximin, or inequality aversion [IA]) best explained participants’ decisions in a drink‐choosing task. We found that participants’ behavior was best described by the maximin metric, describing the desire to maximize the happiness of the worst‐off person, though participant behavior was also consistent with maximizing group utility (the maxsum metric) and the IA metric to a lesser extent. Participant behavior was consistent across variation in the agents involved and tended to become more maxsum‐oriented when participants were told they were players in the task (Experiment 1). In later experiments, participants maintained maximin behavior across multi‐step tasks rather than shortsightedly focusing on the individual steps therein (Experiment 2, Experiment 3). By repeatedly asking participants what choices they would hope for in an optimal, just decision‐maker, and carefully disambiguating which quantitative metrics describe these nuanced choices, we help constrain the space of what behavior we desire in leaders, artificial intelligence systems helping decision‐makers, and the assistive robots and decision‐makers of the future.
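The four candidate metrics can be made concrete with a toy version of the drink-choosing task. The utility numbers below are invented for illustration, and the inequality-aversion functional form (total utility penalized by the spread) is a simple stand-in, not necessarily the exact form fit to participants.

```python
import numpy as np

# Hypothetical utilities for three recipients under each of two drink options.
options = {"tea": np.array([3.0, 2.0, 2.0]),
           "soda": np.array([5.0, 4.0, 0.0])}

def maximax(u):  return u.max()   # happiness of the best-off person
def maxsum(u):   return u.sum()   # total group utility
def maximin(u):  return u.min()   # happiness of the worst-off person
def inequality_averse(u, alpha=1.0):
    # Illustrative IA form: total utility minus a penalty for the spread.
    return u.sum() - alpha * (u.max() - u.min())

choices = {m.__name__: max(options, key=lambda o: m(options[o]))
           for m in (maximax, maxsum, maximin, inequality_averse)}
```

The example is chosen so the metrics disagree: maxsum and maximax favor the soda (higher total and peak utility), while maximin and the spread-penalized IA metric favor the tea, which leaves no one with zero utility. It is exactly this kind of disagreement that lets choice data discriminate among the metrics.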