Objectives: This study proposes methods for blending design components of clinical effectiveness and implementation research. Such blending can provide benefits over pursuing these lines of research independently: for example, more rapid translational gains, more effective implementation strategies, and more useful information for decision makers. This study proposes a "hybrid effectiveness-implementation" typology, describes a rationale for the use of such designs, outlines the design decisions that must be faced, and provides several real-world examples.
Results: An effectiveness-implementation hybrid design is one that takes a dual focus a priori in assessing clinical effectiveness and implementation. We propose 3 hybrid types: (1) testing the effects of a clinical intervention on relevant outcomes while observing and gathering information on implementation; (2) dual testing of clinical and implementation interventions/strategies; and (3) testing of an implementation strategy while observing and gathering information on the clinical intervention's impact on relevant outcomes.
Conclusions: The hybrid typology proposed herein must be considered a construct still in evolution. Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice.
Background: Realist synthesis is an increasingly popular approach to the review and synthesis of evidence, which focuses on understanding the mechanisms by which an intervention works (or not). There are few published examples of realist synthesis. This paper therefore fills a gap by describing, in detail, the process used for a realist review and synthesis to answer the question 'what interventions and strategies are effective in enabling evidence-informed healthcare?' The strengths and challenges of conducting realist review are also considered.
Methods: The realist approach involves identifying underlying causal mechanisms and exploring how, and under what conditions, they work. The stages of this review included: defining the scope of the review (concept mining and framework formulation); searching for and scrutinising the evidence; extracting and synthesising the evidence; and developing the narrative, including hypotheses.
Results: Based on key terms and concepts related to various interventions to promote evidence-informed healthcare, we developed an outcome-focused theoretical framework. Questions were tailored for each of four theory/intervention areas within the theoretical framework and were used to guide development of a review and data extraction process. The search for literature within our first theory area, change agency, was executed, and the screening procedure resulted in the inclusion of 52 papers. Using the questions relevant to this theory area, data were extracted by one reviewer and validated by a second reviewer. Synthesis involved organisation of the extracted data into evidence tables, theming, formulation of chains of inference, linking between the chains of inference, and hypothesis formulation. The narrative was developed around the hypotheses generated within the change agency theory area.
Conclusions: Realist synthesis lends itself to the review of complex interventions because it accounts for context as well as outcomes in the process of systematically and transparently synthesising relevant literature. While realist synthesis demands flexible thinking and the ability to deal with complexity, the rewards include the potential for more pragmatic conclusions than alternative approaches to systematic reviewing. A separate publication will report the findings of the review.
Background: Based on a critical synthesis of the literature on use of the Promoting Action on Research Implementation in Health Services (PARIHS) framework, revisions and a companion Guide were developed by a group of researchers independent of the original PARIHS team. The purpose of the Guide is to enhance and optimize the efforts of researchers using PARIHS in implementation trials and evaluations.
Methods: The authors used a planned, structured process to organize and synthesize critiques, discussions, and potential recommendations for refinements of the PARIHS framework arising from a systematic review. Using a templated form, each author independently recorded the key components of each reviewed paper: study definitions, perceived strengths/limitations of PARIHS, other observations regarding key issues, and recommendations regarding needed refinements. After reaching consensus on these key components, the authors summarized the information and developed the Guide.
Results: A number of revisions, perceived as consistent with the PARIHS framework's general nature and intent, are proposed. The related Guide is composed of a set of reference tools, provided in Additional files. Its core content is built upon the basic elements of PARIHS and current implementation science.
Conclusions: We invite researchers using PARIHS for targeted evidence-based practice (EBP) implementations with a strong task-orientation to use this Guide as a companion and to apply the revised framework prospectively and comprehensively. Researchers are also encouraged to evaluate its use relative to perceived strengths and issues. Such evaluations and critical reflections regarding PARIHS and our Guide could thereby promote the framework's continued evolution.
Background: Facilitation has been identified in the literature as a potentially key component of successful implementation. It has not, however, been either well defined or well studied. Significant questions remain about the operational definition of facilitation and about its relationship to other interventions, especially to other change agent roles when used in multi-faceted implementation projects. Researchers who are part of the Quality Enhancement Research Initiative (QUERI) are actively exploring various approaches and processes, including facilitation, to enable implementation of best practices in the Veterans Health Administration health care system, the largest integrated healthcare system in the United States. This paper describes a systematic, retrospective evaluation of implementation-related facilitation experiences within QUERI, a quality improvement program developed by the US Department of Veterans Affairs.
Methods: A post-hoc evaluation was conducted through a series of semi-structured interviews to examine the concept of facilitation across several multi-site QUERI implementation studies. The interview process is based on a technique developed in the field of education, which systematically enhances learning through experience by stimulating recall and reflection regarding past complex activities. An iterative content analysis approach relative to a set of conceptually based interview questions was used for data analysis.
Findings: Findings suggest that facilitation, within an implementation study initiated by a central change agency, is a deliberate and valued process of interactive problem solving and support that occurs in the context of a recognized need for improvement and a supportive interpersonal relationship. Facilitation was described primarily as a distinct role with a number of potentially crucial behaviors and activities. The data further suggest that external facilitators were likely to use or integrate other implementation interventions while performing this problem-solving and supportive role.
Preliminary Conclusions: This evaluation provides evidence to suggest that facilitation could be considered a distinct implementation intervention, just as audit and feedback, educational outreach, or similar methods are considered discrete interventions. As such, facilitation should be well defined and explicitly evaluated for its perceived usefulness within multi-intervention implementation projects. Additionally, researchers should better define the specific contribution of facilitation to the success of implementation in different types of projects, different types of sites, and with evidence and innovations of varying levels of strength and complexity.
Background: The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations.
Methods: We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus.
Results: Twenty-four articles met our inclusion criteria: six core concept articles from the original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships. Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.'
Conclusions: While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also a need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies.
This article describes the importance and role of 4 stages of formative evaluation in our growing understanding of how to implement research findings into practice to improve the quality of clinical care. It reviews limitations of traditional approaches to implementation research and presents a rationale for new thinking and the use of new methods. Developmental, implementation-focused, progress-focused, and interpretive evaluations are then defined and illustrated with examples from Veterans Health Administration Quality Enhancement Research Initiative projects. The article also provides methodologic details and highlights challenges encountered in actualizing formative evaluation within implementation research.