Background: Research on causal reasoning often uses group-level data analyses that downplay individual differences, and simple reasoning problems that are unrepresentative of everyday reasoning. In three empirical studies, we used an individual differences approach to investigate the cognitive processes people used in fault diagnosis, a complex diagnostic reasoning task. After first showing how high-level fault diagnosis strategies can be composed of simpler causal inferences, we discussed how two of these strategies—elimination and inference to the best explanation (IBE)—allow normative performance, which minimizes the number of diagnostic tests, whereas backtracking strategies are less efficient. We then investigated whether the use of normative strategies was infrequent and associated with greater fluid intelligence and positive thinking dispositions, and whether normative strategies used slow, analytic processing while non-normative strategies used fast, heuristic processing.

Results: Across three studies and 279 participants, uses of elimination and IBE were infrequent, and most participants used inefficient backtracking strategies. Fluid intelligence positively predicted elimination and IBE use but not backtracking use. Positive thinking dispositions predicted avoidance of backtracking. After classifying participants into groups that consistently used elimination, IBE, and backtracking, we found that participants who used elimination and IBE made fewer, but slower, diagnostic tests than backtracking users.

Conclusions: Participants' fault diagnosis performance showed wide individual differences. Use of normative strategies was predicted by greater fluid intelligence and by more open-minded and engaged thinking dispositions. Elimination and IBE users made the slow, efficient responses typical of analytic processing. Backtracking users made the fast, inefficient responses suggestive of heuristic processing.
In three experiments with 977 participants, we investigated whether people would show belief bias by letting their prior beliefs on politically charged topics unduly influence their reasoning when updating beliefs based on evidence. Participants saw data from fictional studies and judged how strongly COVID-19 mitigation measures influenced the number of COVID-19 cases (political problems) or how strongly a medicine influenced the number of headaches (neutral problems). Based on rational Bayesian models using strong versus weak priors to represent biased beliefs about causal strength, we predicted that people who strongly supported the use of mitigation measures (mainly liberals) would overestimate causal strength on political problems relative to neutral problems, while those who strongly opposed mitigation measures (mainly conservatives) would underestimate strength on political problems. Results suggested that belief bias is driven more by specific beliefs relevant to the reasoning context than by general attitudinal factors like political ideology. In Experiments 1 and 2, liberals and conservatives who strongly supported mitigation measures overestimated strength on political problems. In Experiment 3, conservatives who strongly opposed the use of mitigation measures underestimated causal strength on political problems, and conservatives who supported mitigation measures made higher strength judgments on political problems than those who opposed these measures.

Public Significance Statement: In these studies, people made biased judgments about the strength or effectiveness of COVID-19 mitigation measures (including wearing masks and social distancing), but these biases were based more on their prior beliefs about these measures than on their political ideology.
Study participants who felt that mitigation measures are very useful (mainly liberals but also some conservatives) overestimated the strength of mitigation measures when interpreting the results of scientific studies, while participants who felt that mitigation measures are not useful at all (some conservatives) underestimated the strength of mitigation measures. These findings suggest that giving the public accurate and trustworthy information about how effective specific public health measures are (including vaccines) will help them make better health decisions.
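The prediction above rests on how strong versus weak priors pull a Bayesian estimate of causal strength toward or away from the observed evidence. A minimal sketch of this idea, assuming a conjugate Beta-Binomial update over a causal-strength parameter (the paper's actual models may be more elaborate):

```python
def posterior_mean(a, b, k, n):
    """Posterior mean of causal strength under a Beta(a, b) prior,
    after observing the effect on k of n occasions (Beta-Binomial update)."""
    return (a + k) / (a + b + n)

# Hypothetical evidence: the effect appears on 6 of 10 occasions.
k, n = 6, 10

weak = posterior_mean(1, 1, k, n)         # flat prior: estimate tracks the data
strong_pro = posterior_mean(9, 1, k, n)   # strong prior that measures work: pulled up
strong_anti = posterior_mean(1, 9, k, n)  # strong prior that they don't: pulled down

print(weak, strong_pro, strong_anti)      # ~0.583, 0.75, 0.35
```

With identical evidence, the strong supportive prior yields an overestimate and the strong opposing prior an underestimate relative to the flat-prior baseline, mirroring the predicted over- and underestimation on political problems.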