This research finds evidence for reliable individual differences in people’s perceived attitude stability that predict the actual stability of their attitudes over time. Study 1 examines the reliability and factor structure of an 11-item Personal Attitude Stability Scale (PASS). Study 2 establishes test–retest reliability for the PASS over a 5-week period. Studies 3a and 3b demonstrate the convergent and discriminant validity of the PASS in relation to relevant existing individual differences. Studies 4 and 5 show that the PASS predicts attitude stability following a delay period across several distinct topics. Across multiple attitude objects, Time 1 attitudes were more predictive of Time 2 attitudes for people with high (vs. low) scores on the PASS, indicative of greater attitudinal consistency over time. The final study also demonstrates that the PASS predicts attitude stability above and beyond other related scales.
People often overestimate their understanding of how things work. For instance, people believe they can explain even ordinary phenomena such as the operation of zippers and speedometers in greater depth than they really can. This is called the illusion of explanatory depth. Fortunately, a person can expose the illusion by attempting to generate a causal explanation for how the phenomenon operates (e.g., how a zipper works). Researchers have assumed for two decades that explanation exposes the illusion because explanation makes salient the gaps in a person’s knowledge of that phenomenon. However, recent evidence suggests that people might be able to expose the illusion by instead explaining a different phenomenon. If true, this would challenge our fundamental understanding of how the illusion works. Across three preregistered studies we tested whether the process of explaining one phenomenon (e.g., how a zipper works) would lead someone to report knowing less about a completely different phenomenon (e.g., how snow forms). In each study we found that explaining led people to report knowing less about various phenomena, regardless of what was explained. For example, people reported knowing less about how snow forms after attempting to explain how a zipper works. We discuss alternative accounts of the illusion of explanatory depth that might better fit our results. We also consider the utility of explanation as an indirect, non-confrontational debiasing method in which a person generalizes a feeling of ignorance about one phenomenon to their knowledge base more generally.
During the COVID-19 pandemic, public health guidance (e.g., regarding the use of non-medical masks) changed over time. Although many revisions were a result of gains in scientific understanding, we nonetheless hypothesized that making changes in guidance salient would negatively affect evaluations of experts and health-protective intentions. In Study 1 (N = 300), we demonstrate that describing COVID-19 guidance in terms of inconsistency (versus consistency) leads people to perceive scientists and public health authorities less favorably (e.g., as less expert). For participants in Canada (n = 190), though not the U.S. (n = 110), making guidance change salient also reduced intentions to download a contact tracing app. In Study 2 (N = 1399), we show that a brief forewarning intervention mitigates detrimental effects of changes in guidance. In the absence of forewarning, emphasizing inconsistency harmed judgments of public health authorities and reduced health-protective intentions, but forewarning eliminated this effect.