In an effort to responsibly incorporate evidence based on single-case designs (SCDs) into the What Works Clearinghouse (WWC) evidence base, the WWC assembled a panel of individuals with expertise in quantitative methods and SCD methodology to draft SCD standards. In this article, the panel provides an overview of its recommended SCD standards (henceforth referred to as the Standards), which were adopted in Version 1.0 of the WWC's official pilot standards. The Standards are applied sequentially to research studies that incorporate SCDs. The design standards focus on the methodological soundness of SCDs: reviewers assign each study one of three categories, Meets Standards, Meets Standards With Reservations, or Does Not Meet Standards. The evidence criteria focus on the credibility of the reported evidence: outcome measures that meet the design standards (with or without reservations) are examined by reviewers trained in visual analysis and categorized as demonstrating Strong Evidence, Moderate Evidence, or No Evidence. An illustration of an actual research application of the Standards is provided. Issues that the panel did not address are presented as priorities for future consideration, and implications for research and the evidence-based practice movement in psychology and education are discussed. The WWC's Version 1.0 SCD standards are currently being piloted in systematic reviews conducted by the WWC. This document reflects the initial standards recommended by the authors, as well as the underlying rationale for those standards. It should be noted that the WWC may revise the Version 1.0 standards based on the results of the pilot; future versions of the WWC standards can be found at http://www.whatworks.ed.gov.
The purpose of this article is to demonstrate the application of mixed methods research designs to multiyear programmatic research and development projects whose goals include the integration of cultural specificity when generating or translating evidence-based practices. The authors propose a set of five mixed methods designs related to different phases of program development research: (a) formative research, Qual →/+ Quan; (b) theory development or modification and testing, Qual → Quan →/+ Qual → Quan . . . Qual → Quan; (c) instrument development and validation, Qual → Quan; (d) program development and evaluation, Qual →/+ Quan →/+ Qual →/+ Quan . . . Qual →/+ Quan, or Qual →← Quan; and (e) evaluation research, Qual + Quan. The authors illustrate the application of these designs to creating and validating ethnographically informed psychological assessment measures and to developing and evaluating culturally specific intervention programs within a multiyear research program conducted in Sri Lanka.
This article expands on an emerging mixed-method approach for validating culturally specific constructs (see Hitchcock et al., 2005). Previous work established an approach for dealing with cultural impacts when assessing psychological constructs, and the current article extends these efforts to studying stress reactions among adolescents in Sri Lanka. Ethnographic data collection and analysis techniques were used to construct scenarios that are stressful to Sri Lankan youth, along with survey items that assess related coping mechanisms. The data were factor analyzed, the results were triangulated with qualitative findings, and reliability estimates of the resulting scales were obtained. This in turn generated a pilot assessment approach that can be used to measure stress and coping reactions in a distinct culture. The procedures described here could be replicated to generate culturally specific instruments in international contexts, or when working with ethnic minorities within a given nation. This should in turn generate the information needed to develop culturally relevant intervention work.
This article uses the Comprehensive Mixed-Methods Participatory Evaluation (CMMPE; Nastasi & Hitchcock, 2008; Nastasi et al., 2004) model as a framework for addressing the multiplicity of evaluation decisions and the complex nature of questions related to program success in multilevel interventions. CMMPE defines program success in terms of acceptability, integrity, social or cultural validity, outcomes (impact), sustainability, and institutionalization, thus broadening traditional notions of program outcomes. The authors use CMMPE and the example of a community-based multilevel sexual risk prevention program with multiple outcomes to discuss the challenges of evaluating multilevel interventions. The sexual risk program exemplifies what Schensul and Trickett (this issue) characterize as multilevel intervention-multilevel evaluation (M-M), with both intervention and evaluation at the community, health practitioner, and patient levels. The illustration provides the context for considering several challenges related to M-M designs: feasibility of randomized controlled trials within community-based multilevel interventions; acceptability and social or cultural validity of evaluation procedures; implementer, recipient, and contextual variations in program success; interactions among levels of the intervention; unanticipated changes or conditions; multiple indicators of program success; engaging multiple stakeholders in a participatory process; and evaluating sustainability and institutionalization.
The complexity of multilevel intervention and evaluation designs challenges traditional notions of evaluation research and experimental designs. Overcoming these challenges is critical to effective translation of research to practice in psychology and related disciplines.