Many people enjoy eating meat but dislike causing pain to animals. Dissociating meat from its animal origins may be a powerful way to avoid the cognitive dissonance arising from this 'meat paradox'. Here, we provide the first comprehensive test of this hypothesis and highlight the underlying psychological mechanisms. Processed meat made participants less empathetic towards the slaughtered animal than unprocessed meat (Study 1). A whole pork roast evoked less empathy (Study 2a) and disgust (Study 2b) when served beheaded than when the head was present. These affective responses, in turn, made participants more willing to eat the roast and less willing to consider an alternative vegetarian dish. Conversely, presenting a living animal in a meat advertisement increased empathy and reduced willingness to eat meat (Study 3). Next, describing industrial meat production as "harvesting" rather than "killing" or "slaughtering" indirectly reduced empathy (Study 4). Last, replacing "beef/pork" with "cow/pig" on a restaurant menu increased empathy and disgust, both of which reduced willingness to eat meat and increased willingness to choose an alternative vegetarian dish (Study 5). Across experiments, effects were strongly mediated by dissociation and, in Studies 3 and 5, interacted with participants' general dissociation tendencies, such that effects were particularly pronounced among participants who habitually dissociate meat from animals in their daily lives. Together, this line of research demonstrates the large role that various culturally entrenched processes of dissociation play in meat consumption.
Statistical information such as death-risk estimates is frequently used to illustrate the magnitude of a problem. Such mortality statistics are, however, easier to evaluate when presented alongside an earlier estimate, as the two data points together illustrate an upward or downward change. How are people influenced by such changes? In seven experiments, participants read mortality statistics (e.g., number of yearly deaths or expert-estimated death risks) reported at two points in time for various cancer types. Each cancer type was manipulated to have either a downward trajectory (e.g., an estimated death risk of 37% in 2012, adjusted downward to 22% in 2014), an upward trajectory (e.g., 7% → 22%), or a flat trajectory (e.g., 22% → 22%). For each cancer type, participants estimated future mortality statistics and rated perceived severity. They also allocated real money between projects aimed at preventing the different cancer types. Participants' responses indicated that they expected a trend formed from just two data points to continue into the future. They also perceived cancer types with similar present mortality statistics as more severe, and allocated more money to them, when those types had an upward rather than a flat or downward trajectory. Although there are boundary conditions, we conclude that people's severity ratings and helping behavior can be influenced by trend information even when such information is based on only two data points.
Past research has revealed a trend effect when people are faced with a revised probabilistic forecast: a forecasted event that has become more (vs. less) certain is taken to signal a trend towards even stronger (weaker) certainty in future revisions of the forecast. The present paper expands this finding by exploring the boundary conditions of the trend effect and how it affects judgments of the forecaster. In Study 1, the trend effect persisted when receivers processed the forecast more deliberately, by considering reasons for the revision. In Study 2, trend continuation was predicted even when the two forecasts were made by different experts at different points in time. Study 3 demonstrated that the effect disappears when receivers are given an earlier forecast that disrupts the linearity of the trend (e.g., a 60%-70% sequence preceded by a 70% forecast). In Study 4, two forecasters were perceived as more in agreement when they revised divergent probabilities in the same rather than in opposite directions. If the event then occurred, a forecast with a downgraded probability (e.g., from 50% to 40%) was judged to be less accurate than an equally uncertain single forecast (40%). These results demonstrate the robustness of the trend effect based on two forecasts, which affects not only receivers' expectations of what comes next but also their perceptions of the forecaster and of forecast accuracy. The findings have implications for how people communicate and understand risks and other uncertain events in areas such as climate science, weather prediction, political science, and medicine.
Probability estimates can be given as ranges or uncertainty intervals, where often only one of the interval bounds (lower or upper) is specified. For instance, a climate forecast can describe La Niña as having "more than 70% chance" or "less than 90% chance" of occurring. In three experiments, we studied how research participants perceived climate-related forecasts expressed as lower-bound ("over X% chance") or upper-bound ("under Y% chance") probability statements. Results indicate that such single-bound statements convey pragmatic information beyond the numeric probabilities themselves. First, the studies show that these statements are directional, steering listeners' attention in opposite directions: "over" statements guide attention towards the possible occurrence of the event and are explained by reasons for why it might happen, while "under" statements direct attention to its possible non-occurrence and are more often explained by reasons for why the target event might not occur, corresponding to positive (it is possible) versus negative (it is uncertain) verbal probabilities. Second, boundaries were found to reveal the forecaster's beliefs and could be perceived as indicative of an increasing or a decreasing trend. Single-bound probability estimates are therefore not neutral communications of a probability level but may "leak" information about the speaker's expectations and about past and future developments of the forecast.