The Cochrane Qualitative and Implementation Methods Group (CQIMG) develops and publishes guidance on the synthesis of qualitative and mixed-method evidence from process evaluations. Despite a proliferation of methods for the synthesis of qualitative research, less attention has been paid to how such syntheses can be integrated within intervention effectiveness reviews. In this paper we report the group's updated guidance on the approaches, methods and tools that can be used to integrate the findings of quantitative studies evaluating intervention effectiveness with those of qualitative studies and process evaluations. We draw on conceptual analyses of mixed-methods systematic review designs and on the range of methods and tools used in published reviews that have successfully integrated different types of evidence. We outline five key methods and tools as devices for integration, which vary in the levels at which integration takes place, the specialist skills and expertise required within the review team, and their appropriateness where evidence is limited. Where qualitative and process evidence must be integrated within an intervention effectiveness review, we recommend a sequential approach: evidence from each tradition is synthesised separately, using methods consistent with that tradition, before integration takes place using a common framework. Reviews that systematically integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness are rare. This guidance aims to support review teams in achieving integration, and we encourage further development through reflection and formal testing.
• Systematic reviews which integrate qualitative and process evaluation evidence alongside quantitative evidence on intervention effectiveness are rare.
• We offer guidance on the range of approaches, methods and tools which can be specifically applied at the stage within a review at which the findings from each type of research are integrated.
• We identify, outline, and compare and contrast the characteristics of five key methods and tools as devices for integration which have been tested within reviews.
• Review teams can use this guidance to help them choose the most appropriate method for their context.
The concept of validity has been a central component of critical appraisal exercises evaluating the methodological quality of quantitative studies. Reactions among qualitative researchers have been mixed as to whether validity should be applied to qualitative research at all and, if so, what criteria should be used to distinguish high-quality articles from others. We compared the ability of three online critical appraisal instruments to facilitate an assessment of validity. Many reviewers have used the Critical Appraisal Skills Programme (CASP) tool to complete their critical appraisal exercises; however, CASP appears to be less sensitive to aspects of validity than the evaluation tool for qualitative studies (ETQS) and the Joanna Briggs Institute (JBI) tool. The ETQS provides detailed instructions on how to interpret criteria; however, it is the JBI tool, with its focus on congruity, that appears to be the most coherent.
The real verification of the 'lines of action' suggested by a meta-aggregation lies in the satisfactory consequences, mental or physical, that the synthesized statements, which summarize the basic ideas emerging from the included studies, are able to generate for end users.