There is often a curious distinction between what the scientific community and the general population believe to be true of dire scientific issues, and this skepticism tends to vary markedly across groups. For instance, in the case of climate change, Republicans (conservatives) are especially skeptical of the relevant science, particularly compared with Democrats (liberals). What causes such radical group differences? We suggest, as have previous accounts, that this phenomenon is often motivated. However, the source of this motivation is not necessarily an aversion to the problem, per se, but an aversion to the solutions associated with the problem. This difference in underlying process holds important implications for understanding, predicting, and influencing motivated skepticism. In 4 studies, we tested this solution aversion explanation for why people are often so divided over evidence and why this divide often occurs so saliently across political party lines. Studies 1, 2, and 3, using correlational and experimental methodologies, demonstrated that Republicans' increased skepticism toward environmental sciences may be partly attributable to a conflict between specific ideological values and the most popularly discussed environmental solutions. Study 4 found that, in a different domain (crime), those holding a more liberal ideology (support for gun control) also show skepticism motivated by solution aversion.

Many serious local, national, and global problems exist today. For instance, physical and social scientists have identified problems such as climate change, rising crime rates, and emerging health epidemics as requiring immediate, proactive intervention.
However, even in cases in which there is little scientific debate, substantial skepticism exists in the general populace, with experiments and polling data showing that groups of people vary widely in the degree to which they dispute these facts (Pew Research Center, 2010; Schuldt, Konrath, & Schwarz, 2011). Why do some people, in some domains, appear so especially distrustful of conclusions that scientists themselves agree upon? Several interesting perspectives have been offered to help explain patterns of scientific denial, including heightened sensitivity to negative information (Carraro, Castelli, & Macchiella, 2011; Oxley et al., 2008), dispositional motivated cognition differences (Jost, Glaser, Kruglanski, & Sulloway, 2003), and conspiratorial mindsets. We propose a motivation behind the denial of many of today's problems that is rooted not in a fear of the general problem, per se, but rather in fear of the specific solutions associated with that problem. Building on and integrating the growing literatures addressing the psychology of ideological motivations (Carney, Jost,
We introduce a simple solution to help consumers manage choices between healthy and unhealthy food options: vice-virtue bundles. Vice-virtue bundles are item aggregates with varying proportions of both vice and virtue, holding overall quantity constant. Four studies compare choice and perceptions of differently composed vice-virtue bundles relative to one another and to pure vice and pure virtue options. Although multiple consumer segments can be identified, results suggest that people overall tend to prefer vice-virtue bundles with small (¼) to medium (½) proportions of vice rather than large (¾) proportions of vice. Moreover, people generally rate vice-virtue bundles with small vice proportions as healthier than, but similarly tasty to, bundles with larger vice proportions. For most individuals, choice patterns are different from those predicted by variety-seeking accounts alone. Instead, these findings provide evidence of asymmetric effectiveness of small vice and virtue proportions at addressing taste and health goals, respectively.
We propose that people may gain certain "offensive" and "defensive" advantages for their cherished belief systems (e.g., religious and political views) by including aspects of unfalsifiability in those belief systems, such that some aspects of the beliefs cannot be tested empirically and conclusively refuted. This may seem peculiar, irrational, or at least undesirable to many people because it is assumed that the primary purpose of a belief is to know objective truth. However, past research suggests that accuracy is only one psychological motivation among many, and falsifiability or testability may be less important when the purpose of a belief serves other psychological motives (e.g., to maintain one's worldviews, serve an identity). In Experiments 1 and 2 we demonstrate the "offensive" function of unfalsifiability: that it allows religious adherents to hold their beliefs with more conviction and political partisans to polarize and criticize their opponents more extremely. Next we demonstrate unfalsifiability's "defensive" function: When facts threaten their worldviews, religious participants frame specific reasons for their beliefs in more unfalsifiable terms (Experiment 3) and political partisans construe political issues as more unfalsifiable ("moral opinion") instead of falsifiable ("a matter of facts"; Experiment 4). We conclude by discussing how in a world where beliefs and ideas are becoming more easily testable by data, unfalsifiability might be an attractive aspect to include in one's belief systems, and how unfalsifiability may contribute to polarization, intractability, and the marginalization of science in public discourse.
The pursuit of passion in one’s work is touted in contemporary discourse. Although passion may indeed be beneficial in many ways, we suggest that the modern cultural emphasis may also serve to facilitate the legitimization of unfair and demeaning management practices—a phenomenon we term the legitimization of passion exploitation. Across 7 studies and a meta-analysis, we show that people do in fact deem poor worker treatment (e.g., asking employees to do demeaning tasks that are irrelevant to their job description, asking employees to work extra hours without pay) as more legitimate when workers are presumed to be “passionate” about their work. Of importance, we demonstrate 2 mediating mechanisms by which this process of legitimization occurs: (a) assumptions that passionate workers would have volunteered for this work if given the chance (Studies 1, 3, 5, 6, and 8), and (b) beliefs that, for passionate workers, work itself is its own reward (Studies 3, 4, 5, 6, and 8). We also find support for the reverse direction of the legitimization process, in which people attribute passion to an exploited (vs. nonexploited) worker (Study 7). Finally, and consistent with the notion that this process is connected to justice motives, a test of moderated mediation shows this is most pronounced for participants high in belief in a just world (Study 8). Taken together, these studies suggest that although passion may seem like a positive attribute to assume in others, it can also license poor and exploitative worker treatment.
People often use their own feelings as a basis to predict others' feelings. For example, when trying to gauge how much someone else enjoys a television show, people might think "How much do I enjoy it?" and use this answer as a basis for estimating others' reactions. Although personal experience (such as actually watching the show oneself) often improves empathic accuracy, we found that gaining too much experience can impair it. Five experiments highlight a desensitization bias in emotional perspective taking, with consequences for social prediction, social judgment, and social behavior. Participants who viewed thrilling or shocking images many times predicted that first-time viewers would react less intensely (Experiments 1 and 2); participants who heard the same funny joke or annoying noise many times estimated less intense reactions of first-time listeners (Experiments 3 and 4); and, further, participants were less likely to actually share good jokes and felt less bad about blasting others with annoying noise after they themselves became desensitized to those events (Experiments 3-5). These effects were mediated by participants' own attenuated reactions. Moreover, observers failed to anticipate this bias, believing that overexposed participants (i.e., repeatedly exposed participants who became desensitized) would make better decisions on their behalf (Experiment 5). Taken together, these findings reveal a novel paradox in emotional perspective taking: If people experience an evocative event many times, they may not become wiser companions but worse ones, unable to disentangle self-change from other-oriented thinking. Just as lacking exposure to others' experiences can create gaps in empathy and understanding, so may gaining too much.