Background
Rapid reviews have the potential to overcome a key barrier to the use of research evidence in decision making, namely the lack of timely and relevant research. This rapid review of systematic reviews and primary studies sought to answer the question: what are the best methodologies to enable a rapid review of research evidence for evidence-informed decision making in health policy and practice?

Methods
This rapid review utilised systematic review methods and was conducted according to a pre-defined protocol with clear inclusion criteria (PROSPERO registration: CRD42015015998). A comprehensive search strategy was used, covering published and grey literature written in English, French, Portuguese or Spanish from 2004 onwards. Eleven databases and two websites were searched. Two review authors independently applied the eligibility criteria. Data extraction was done by one reviewer and checked by a second. The methodological quality of included studies was assessed independently by two reviewers. A narrative summary of the results is presented.

Results
Five systematic reviews and one randomised controlled trial (RCT) that investigated methodologies for rapid reviews met the inclusion criteria. None of the systematic reviews were of sufficient quality to allow firm conclusions to be made; the findings therefore need to be treated with caution. There is no agreed definition of rapid reviews in the literature and no agreed methodology for conducting them. While a wide range of 'shortcuts' are used to make rapid reviews faster than a full systematic review, the included studies found little empirical evidence of their impact on the conclusions of either rapid or systematic reviews. There is some evidence from the included RCT (which had a low risk of bias) that rapid reviews may improve the clarity and accessibility of research evidence for decision makers.

Conclusions
Greater care needs to be taken in improving the transparency of the methods used in rapid review products. There is no evidence available to suggest that rapid reviews should not be done or that they are misleading in any way. We offer an improved definition of rapid reviews to guide future research, as well as clearer guidance for policy and practice.

Electronic supplementary material
The online version of this article (doi:10.1186/s12961-016-0155-7) contains supplementary material, which is available to authorized users.
Background
The objective of this work was to inform the design of a rapid response program to support evidence-informed decision-making in health policy and practice for the Americas region. Specifically, we focus on the following: (1) What are the best methodological approaches for rapid reviews of the research evidence? (2) What other strategies are needed to facilitate evidence-informed decision-making in health policy and practice? (3) How best can a rapid response program be operationalized?

Methods
The evidence used to inform the design of a rapid response program included (i) two rapid reviews of methodological approaches for rapid reviews of the research evidence and of strategies to facilitate evidence-informed decision-making, (ii) supplementary literature on the "shortcuts" that could be considered to reduce the time needed to complete rapid reviews, (iii) four case studies, and (iv) supplementary literature to identify additional operational issues for the design of the program.

Results
There is no agreed definition of rapid reviews in the literature and no agreed methodology for conducting them. Better reporting of rapid review methods is needed. The literature found in relation to shortcuts will be helpful in choosing shortcuts that maximize timeliness while minimizing the impact on quality. Evidence for other strategies that can be used concurrently to facilitate the uptake of research evidence, including evidence drawn from rapid reviews, is presented. Operational issues that need to be considered in designing a rapid response program include the implications of a "user-pays" model, the importance of recruiting staff with the right mix of skills and qualifications, and ensuring that the impact of the model on research use in decision-making is formally evaluated.

Conclusions
When designing a new rapid response program, greater attention needs to be given to specifying the rapid review methods and reporting these in sufficient detail to allow a quality assessment. It will also be important to engage in other strategies to facilitate the uptake of the rapid reviews and to evaluate the chosen model in order to make refinements and add to the evidence base for evidence-informed decision-making.

Electronic supplementary material
The online version of this article (doi:10.1186/s13012-016-0472-9) contains supplementary material, which is available to authorized users.
Background
As a source of readily available evidence, rigorously synthesized and interpreted by expert clinicians and methodologists, clinical guidelines are part of an evidence-based practice toolkit and, when transformed into practice recommendations, have the potential to improve both the process of care and patient outcomes. In Brazil, the process of developing and updating clinical guidelines for the Brazilian Unified Health System (Sistema Único de Saúde, SUS) is already well systematized by the Ministry of Health. However, the implementation process for those guidelines has not yet been discussed and well structured. Therefore, the first step of this project, and the primary objective of this study, was to summarize the evidence on the effectiveness of strategies used to promote clinical practice guideline implementation and dissemination.

Methods
This overview used systematic review methodology to locate and evaluate published systematic reviews of strategies for clinical practice guideline implementation and adhered to the PRISMA reporting guidelines for systematic reviews.

Results
This overview identified 36 systematic reviews covering 30 strategies targeting healthcare organizations, healthcare providers and patients to promote guideline implementation. The most commonly reported interventions were educational materials, educational meetings, reminders, academic detailing, and audit and feedback. Care pathways and educational meetings (each as a single intervention), together with organizational culture and audit and feedback (both implemented in combination with other strategies), were categorized as generally effective in the systematic reviews. In the meta-analyses, organizational culture, educational interventions and reminders each proved effective in promoting physicians' adherence to guidelines when used alone; organizational culture also proved effective when used in conjunction with other strategies. For patient-related outcomes, educational interventions showed effective results for disease-targeted outcomes in both the short and the long term.

Conclusion
This overview provides a broad summary of the best evidence on guideline implementation. Although the included literature highlights various limitations, namely the lack of standardization, the variable methodological quality of the studies, and especially the absence of any conclusion about the superiority of one strategy over another, the summary provided by this study offers information on the strategies that have been most widely studied in recent years and their effectiveness in the contexts in which they were applied. This panorama can therefore support the selection of strategies suited to SUS and other health systems, seeking a positive impact on the appropriate use of guidelines, healthcare outcomes and the sustainability of the SUS.
Background
While calls for the institutionalization of evidence-informed policy-making (EIP) have become stronger in recent years, there is a paucity of methods that governments and organizational knowledge brokers can use to sustain and integrate EIP as part of mainstream health policy-making. The objective of this paper was to conduct a knowledge synthesis of the published and grey literature to develop a theoretical framework with the key features of EIP institutionalization.

Methods
We applied a critical interpretive synthesis (CIS) that allowed for a systematic, yet iterative and dynamic, analysis of heterogeneous bodies of literature to develop an explanatory framework for EIP institutionalization. We used a "compass" question to create a detailed search strategy and conducted electronic searches to identify papers based on their potential relevance to EIP institutionalization. Papers were screened and extracted independently and in duplicate. A constant comparative method was applied to develop a framework on EIP institutionalization. The CIS was triangulated with the findings of stakeholder dialogues that involved civil servants, policy-makers and researchers.

Results
We identified 3001 references, of which 88 papers met our eligibility criteria. This CIS resulted in a definition of EIP institutionalization as the "process and outcome of (re-)creating, maintaining and reinforcing norms, regulations, and standard practices that, based on collective meaning and values, actions as well as endowment of resources, allow evidence to become—over time—a legitimate and taken-for-granted part of health policy-making". The resulting theoretical framework comprised six key domains of EIP institutionalization that capture both structure and agency: (1) governance; (2) standards and routinized processes; (3) partnership, collective action and support; (4) leadership and commitment; (5) resources; and (6) culture. Furthermore, EIP institutionalization is achieved through five overlapping stages: (i) precipitating events; (ii) de-institutionalization; (iii) semi-institutionalization (comprising theorization and diffusion); (iv) (re-)institutionalization; and (v) renewed de-institutionalization processes.

Conclusions
This CIS advances the theoretical and conceptual discussions on EIP institutionalization and provides new insights into an evidence-informed framework for initiating, strengthening and/or assessing efforts to institutionalize EIP.