Context: Participatory research (PR) is the co-construction of research through partnerships between researchers and people affected by and/or responsible for action on the issues under study. Evaluating the benefits of PR is challenging for several reasons: the research topics, methods, and study designs are heterogeneous; the extent of collaborative involvement may vary over the duration of a project and from one project to the next; and partnership activities may generate a complex array of both short- and long-term outcomes.

Methods: Our review team was a collaboration among researchers and decision makers in public health, research funding, ethics review, and community-engaged scholarship. We identified, selected, and appraised a large-variety sample of primary studies describing PR partnerships; at each stage, two team members independently reviewed and coded the literature. We used key realist review concepts (middle-range theory, demi-regularity, and context-mechanism-outcome [CMO] configurations) to analyze and synthesize the data, using the PR partnership as the main unit of analysis.

Findings: From 7,167 abstracts and 591 full-text papers, we distilled for synthesis a final sample of twenty-three PR partnerships described in 276 publications. The link between process and outcome in these partnerships was best explained using the middle-range theory of partnership synergy, which demonstrates how PR can (1) ensure culturally and logistically appropriate research, (2) enhance recruitment capacity, (3) generate professional capacity and competence in stakeholder groups, (4) result in productive conflicts followed by useful negotiation, (5) increase the quality of outputs and outcomes over time, (6) increase the sustainability of project goals beyond funded time frames and during gaps in external funding, and (7) create system changes and new, unanticipated projects and activities. Negative examples illustrated why these outcomes were not a guaranteed product of PR partnerships but were contingent on key aspects of context.

Conclusions: We used a realist approach to embrace the heterogeneity and complexity of the PR literature. This theory-driven synthesis identified mechanisms by which PR may add value to the research process. Using the middle-range theory of partnership synergy, our review confirmed findings from previous PR reviews, documented and explained some negative outcomes, and generated new insights into the benefits of PR regarding conflicts and negotiation between stakeholders, program sustainability and advancement, unanticipated project activity, and the generation of systemic change.
Background: Realist evaluation is increasingly used in health services and other fields of research and evaluation. No previous standards exist for reporting realist evaluations. These reporting standards were developed as part of the RAMESES II project, whose aim is to produce initial reporting standards for realist evaluations.

Methods: We purposively recruited a maximum-variety sample of international experts in realist evaluation to our online Delphi panel. Panel members came from a variety of disciplines, sectors and policy fields. We prepared the briefing materials for our Delphi panel by summarising the most recent literature on realist evaluations to identify how and why rigour had been demonstrated and where gaps in expertise and rigour were evident. We also drew on our collective experience as realist evaluators, in training and supporting realist evaluations, and on the RAMESES email list to help us develop the briefing materials. Through discussion within the project team, we developed a list of quality-related issues that needed to be addressed when carrying out realist evaluations. These were shared with the panel members and their feedback was sought. Once the panel members had provided their feedback on our briefing materials, we constructed a set of items for potential inclusion in the reporting standards and circulated these online to panel members. Panel members were asked to rate each potential item twice on a 7-point Likert scale, once for relevance and once for validity. They were also encouraged to provide free-text comments.

Results: We recruited 35 panel members from 27 organisations across six countries and nine different disciplines. Within three rounds our Delphi panel reached consensus on 20 items that should be included in the reporting standards for realist evaluations. The overall response rates for all items in rounds 1, 2 and 3 were 94%, 76% and 80%, respectively.

Conclusion: These reporting standards for realist evaluations were developed by drawing on a range of sources. We hope that these standards will lead to greater consistency and rigour of reporting and make realist evaluation reports more accessible, usable and helpful to different stakeholders.
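The panel procedure described above (each item rated twice on a 7-point Likert scale, with consensus reached within three rounds) can be sketched as a small computation. The consensus rule and the rating data below are illustrative assumptions for exposition; the abstract does not state the threshold the RAMESES II team actually used.

```python
# Illustrative sketch of a Delphi consensus check. Assumed rule: an item
# reaches consensus when at least 70% of responding panellists rate it
# 5 or higher on the 7-point scale. This threshold is a hypothetical
# example, not the rule reported by the RAMESES II project.

CONSENSUS_SHARE = 0.70  # assumed agreement threshold
AGREE_MIN = 5           # ratings of 5-7 on the 7-point scale count as agreement

def reaches_consensus(ratings: list[int]) -> bool:
    """Return True if the share of agreeing ratings meets the threshold."""
    if not ratings:
        return False
    agree = sum(1 for r in ratings if r >= AGREE_MIN)
    return agree / len(ratings) >= CONSENSUS_SHARE

def response_rate(received: int, panel_size: int) -> float:
    """Round-level response rate as a percentage."""
    return 100.0 * received / panel_size

# Hypothetical round: 30 of the 35 panellists rate one item for relevance.
relevance = [7, 6, 6, 5, 5, 7, 4, 6, 5, 7, 6, 5, 5, 6, 7,
             3, 6, 5, 7, 6, 5, 4, 6, 7, 5, 6, 5, 7, 6, 5]
print(reaches_consensus(relevance))                  # consensus reached?
print(round(response_rate(len(relevance), 35), 1))   # round response rate in %
```

In practice each item would be checked twice, once for the relevance ratings and once for the validity ratings, with unresolved items carried into the next round.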
Background: Community-Based Participatory Research (CBPR) is an approach in which researchers and community stakeholders form equitable partnerships to tackle issues related to community health improvement and knowledge production. Our 2012 realist review of CBPR outcomes reported long-term effects that were touched upon but not fully explained in the retained literature. To explore such effects further, we interviewed academic and community partners from partnerships retained in the review. Realist methodology was used to better understand what supports partnership synergy in successful long-term CBPR partnerships, and to further document how equitable partnerships can yield numerous benefits, including the sustainability of relationships, research and solutions.

Methods: Building on our previous realist review of CBPR, we contacted the authors of longitudinal studies of academic-community partnerships retained in the review. Twenty-four participants (community members and researchers) from 11 partnerships were interviewed. A realist logic of analysis was used, involving middle-range theory, context-mechanism-outcome configurations (CMOcs) and the concept of the ‘ripple effect’.

Results: The analysis supports the central importance of developing and strengthening partnership synergy through trust. The ripple-effect concept, used in conjunction with CMOcs, showed that a sense of trust amongst CBPR members was a prominent mechanism leading to partnership sustainability. This in turn resulted in population-level outcomes, including: (a) sustaining collaborative efforts toward health improvement; (b) generating spin-off projects; and (c) achieving systemic transformations.

Conclusion: These results add to other studies on improving the science of CBPR in partnerships with a high level of power-sharing and co-governance. Our results suggest that sustaining CBPR and achieving unanticipated benefits likely depend on trust-related mechanisms and a continuing commitment to power-sharing. These findings have implications for building successful CBPR partnerships to address challenging public health problems and for the complex assessment of outcomes.

Electronic supplementary material: The online version of this article (doi:10.1186/s12889-015-1949-1) contains supplementary material, which is available to authorized users.
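The realist logic of analysis used in the studies above organizes evidence into context-mechanism-outcome configurations (CMOcs): in a given context, a mechanism is triggered that produces an outcome. A minimal way to represent such a configuration as a data structure is sketched below; the field names are illustrative, not a standard schema, and the example configuration merely paraphrases the trust finding reported in the abstract.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CMOConfiguration:
    """A realist context-mechanism-outcome configuration.

    Field names are illustrative choices, not a standard realist schema.
    """
    context: str    # conditions under which the partnership operates
    mechanism: str  # generative process triggered in that context
    outcome: str    # result attributed to the mechanism firing

# Example paraphrasing the trust finding reported above (hypothetical wording).
trust_cmoc = CMOConfiguration(
    context="long-term CBPR partnership with power-sharing and co-governance",
    mechanism="a sense of trust amongst CBPR members",
    outcome="partnership sustainability, spin-off projects, systemic change",
)
print(trust_cmoc.mechanism)
```

Keeping configurations as structured records like this makes it straightforward to group them, for example to look for the demi-regularities (recurring context-mechanism patterns) that realist synthesis searches for.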
Introduction: Realist evaluation is an increasingly popular methodology in health services research. For realist evaluations (RE), this project aims to: develop quality and reporting standards and training materials; build capacity for undertaking and critically evaluating them; and produce resources and training materials for lay participants and those seeking to involve them.

Methods: To achieve our aims, we will: (1) establish management and governance infrastructure; (2) recruit an interdisciplinary Delphi panel of 35 participants with diverse, relevant experience of RE; (3) summarise current literature and expert opinion on best practice in RE; (4) run an online Delphi panel to generate and refine items for quality and reporting standards; (5) capture ‘real world’ experiences and challenges of RE, for example by providing ongoing support to realist evaluations, hosting the RAMESES JISCmail list on realist research, and feeding problems and insights from these into the deliberations of the Delphi panel; (6) produce quality and reporting standards; (7) collate examples of the learning and training needs of researchers, students, reviewers and lay members in relation to RE; (8) develop, deliver and evaluate training materials for RE and deliver training workshops; (9) develop and evaluate information and resources for patients and other lay participants in RE (e.g., draft template information sheets and model consent forms); and (10) disseminate training materials and other resources.

Planned outputs: (1) quality and reporting standards and training materials for RE; (2) methodological support for RE; (3) increased capacity to support and evaluate RE; and (4) accessible, plain-English resources for patients and the public participating in RE.

Discussion: Realist evaluation is a relatively new approach to evaluation, and its overall place in the field is not yet fully established. As with all primary research approaches, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency.