Easily comprehensible summaries of scholarly articles that are provided alongside ‘ordinary’ scientific abstracts, so-called plain language summaries, can be a powerful tool for communicating research findings to a wider audience. Using an experimental within-person design in a preregistered study (N = 166), we showed that laypeople found plain language summaries more comprehensible than scientific abstracts in a psychological journal, and that laypeople actually understood the corresponding information more correctly for plain language summaries. Moreover, in line with the easiness effect of science popularization, individuals perceived plain language summaries as more credible and were more confident about their ability to make a decision based on them. Whether and under which circumstances this higher perceived credibility is justified is discussed, together with other practical and theoretical implications of our findings. In sum, our research further strengthens arguments for providing plain language summaries of psychological research findings by demonstrating that they actually work in practice.
Findings from psychological research are usually difficult for non-experts to interpret. Yet, non-experts resort to psychological findings to inform their decisions (e.g., whether or not to seek psychotherapeutic treatment). Thus, the communication of psychological research to non-expert audiences has received increasing attention in recent years. Plain language summaries (PLS) are abstracts of peer-reviewed journal articles that aim to explain the rationale, methods, findings, and interpretation of a scientific study to non-expert audiences using non-technical language. Unlike media articles or other forms of accessible research summaries, PLS are usually written by the authors of the respective journal article, ensuring that research content is accurately reproduced. In this study, we compared the readability of PLS and corresponding scientific abstracts in a sample of 103 journal articles from two psychological peer-reviewed journals. To assess readability, we calculated four readability indices that quantify text characteristics related to reading comprehension (e.g., word difficulty, sentence length). Analyses of variance revealed that PLS were easier to read than scientific abstracts. This effect emerged in both included journals and across all readability indices, and there was little evidence that its magnitude differed between the journals. In sum, this study shows that PLS may be an effective instrument for communicating psychological research to non-expert audiences. We discuss future research avenues to increase the quality of PLS and strengthen their role in science communication.
Spectacular cases of scientific misconduct have contributed to concerns about the validity of published results in psychology. In our systematic review, we identified 16 studies reporting prevalence estimates of scientific misconduct and questionable research practices (QRPs) in psychological research. Estimates from these studies varied due to differences in methods and scope. Unlike other disciplines, psychology lacked a reliable lower-bound prevalence estimate of scientific misconduct based on identified cases. Thus, we conducted an additional empirical investigation based on retractions in the database PsycINFO. Our analyses showed that 0.82 per 10,000 journal articles in psychology were retracted due to scientific misconduct, with a steep increase in such retractions between the late 1990s and 2012. Articles retracted due to scientific misconduct were identified in 20 out of 22 PsycINFO subfields. These results show that measures aiming to reduce scientific misconduct should be promoted equally across all psychological subfields.