This study expands upon the extant meta-analytic literature by exploring previously theorised reasons for the failure of school-based, universal social and emotional learning (SEL) programmes to produce expected results. Eighty-nine studies reporting the effects of school-based, universal SEL programmes were examined for differential effects on the basis of: 1) stage of evaluation (efficacy or effectiveness); 2) involvement of the programme developer in the evaluation (led, involved, independent); and 3) whether the programme was implemented in its country of origin (home or away). A range of outcomes was assessed, including social-emotional competence, attitudes towards self, pro-social behaviour, conduct problems, emotional distress, academic achievement and emotional competence. Differential gains were shown across all three factors, although not always in the direction hypothesised. The findings demonstrate a more complex relationship between the identified factors and programme outcomes than previously theorised, and point to important new directions for the field.
Implementation refers to the process by which an intervention is put into practice. Research studies across multiple disciplines, including education, have consistently demonstrated that interventions are rarely implemented as designed and, crucially, that variability in implementation is related to variability in the achievement of expected outcomes. Put simply, implementation matters (Durlak & DuPre, 2008). This paper reviews several key issues in the study of implementation and calls for an increased emphasis on this often-neglected aspect of evaluation research in UK journals. These issues include: programme-specific reasons for studying implementation as an intervention passes through the various stages of development; advancing knowledge and understanding of the processes of implementation (including the balance required between fidelity and adaptation, and the range of factors that may facilitate or impede implementation); and improving the measurement and assessment of implementation. Through discussion of these issues, the case is made for more research that focuses specifically on the examination of implementation in school settings.
This study presents the findings of a systematic review of measures of social and emotional skills for children and young people. The growing attention to this area in recent years has resulted in the development of a large number of measures to aid in the assessment of children and young people. These measures vary in their implementation characteristics and psychometric properties. The methodology of the review followed the general principles of systematic reviewing, including the systematic searching of databases, the adoption of a predetermined set of inclusion and exclusion criteria, and a multistage filtering process. The review process resulted in the retention of 12 measures, which are presented and discussed in relation to key issues in this area, including difficulties with the underlying theory and frameworks for social and emotional skills, inconsistent terminology, the scope and distinctiveness of available measures, and more practical issues such as the type of respondent, location, and purpose of measurement.
Analyses of the relationship between levels of implementation and outcomes of school-based social and emotional learning (SEL) interventions are relatively infrequent and are typically narrowly focused. Thus, our objective was to assess the relationship between variability in a range of implementation dimensions and intervention outcomes in the Promoting Alternative Thinking Strategies (PATHS) curriculum. Implementation of PATHS was examined in 69 classrooms across 23 schools in the first year of a major randomized controlled trial. Implementation data were generated via classroom-level structured observations. In addition to factual data on dosage and reach, exploratory factor analysis of observer ratings revealed two distinct implementation dimensions, namely, “quality and participant responsiveness” and “procedural fidelity.” Student social-emotional skills, pro-social behavior, internalizing symptoms, and externalizing problems were captured through child self-report and teacher informant-report surveys (N = 1721). Hierarchical linear modeling of study data revealed that higher implementation quality and participant responsiveness was associated with significantly lower ratings of students’ externalizing problems at 12-month follow-up. Conversely, and contrary to expectations, higher dosage was associated with significantly lower pro-social behavior and social-emotional skills at 12-month follow-up. No significant associations were found between variability in either procedural fidelity or reach and any intervention outcomes. The implications of these findings are discussed, and study limitations are noted.