“…Based on narrative direct observations and self-report interviews, a thematic analysis identified four intervention implementation themes: (1) 'teacher support', (2) 'teacher supervision', (3) 'intended implementation' and (4) 'established routine'. Collectively, these themes describe the implementing teachers' experiences (Peterson 2016) and the classroom context in which the maths app intervention was situated (Biesta 2010; Humphrey et al 2016). These methods are similar to those used in previous implementation process evaluations (Connolly, Keenan, and Urbanska 2018).…”
Section: Discussion
“…However, these qualitative themes do not capture the extent to which implementation varied across the 11 participating schools, nor do they elucidate how the extent of variation across these themes might be associated with children's learning outcomes with the maths apps. A key aspect of implementation science involves examining the variability in intervention implementation across contexts and learning from this variability (Peterson 2016). Phase II of this study sought to achieve this through examining the relationship between the four intervention implementation themes and children's learning outcomes with the maths apps.…”
“…As such, a more systematic, mixed-methods approach to evaluating implementation is needed (Oakley et al 2006) that also affords statistical examination of the impact of the implementation process on learning outcomes (Peterson 2016; Shaffer 2011; Thomas 2016). The current study demonstrates a novel and informative methodology for examining intervention implementation within a determinant theoretical framework in the context of a recent RCT that evaluated a maths app intervention implemented across 11 primary schools (Outhwaite et al 2018).…”
Randomized controlled trials (RCTs) are commonly regarded as the 'gold standard' for evaluating educational interventions. While this experimental design is valuable in establishing causal relationships between the tested intervention and outcomes, its reliance on statistical aggregation typically underplays the situated context in which interventions are implemented. Developing innovative, systematic methods for evaluating implementation and understanding its impact on outcomes is vital to moving educational evaluation research beyond questions of 'what works', towards a better understanding of the mechanisms underpinning an intervention's effects. The current study presents a pragmatic, two-phase approach that combines qualitative data with quantitative analyses to examine the causal relationships between intervention implementation and outcomes. This new methodological approach is illustrated in the context of a maths app intervention recently evaluated in an RCT across 11 schools. In phase I, four implementation themes were identified: 'teacher support', 'teacher supervision', 'implementation quality', and 'established routine'. In phase II, 'established routine' was found to predict 41% of the variance in children's learning outcomes with the apps. This has significant implications for future scaling. Overall, this new methodological approach offers an innovative way of combining process and impact evaluations when seeking to gain a more nuanced understanding of what works in education and why.
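The phase II step, in which an implementation theme is used to predict variance in learning outcomes, amounts to a regression and an R² calculation. The sketch below uses entirely hypothetical data and variable names (`routine`, `gains`); it illustrates the computation only, not the study's actual data, model, or the reported 41% figure.

```python
# Minimal sketch, assuming a school-level 'established routine' rating and
# mean learning gains per school. All values here are hypothetical.
import numpy as np

# Hypothetical ratings (1-5 scale) and mean learning gains for 11 schools.
routine = np.array([1, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5], dtype=float)
gains = np.array([2.0, 2.5, 3.1, 3.0, 3.8, 4.2, 4.0, 4.6, 5.1, 5.0, 5.9])

# Ordinary least-squares fit: gains ~ intercept + slope * routine.
slope, intercept = np.polyfit(routine, gains, deg=1)
predicted = intercept + slope * routine

# R^2: the proportion of outcome variance explained by the predictor,
# the quantity the study reports for 'established routine'.
ss_res = np.sum((gains - predicted) ** 2)
ss_tot = np.sum((gains - gains.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"variance explained: {r_squared:.2f}")
```

With real data, the same quantity would typically come from a regression package (e.g. `statsmodels`) that also reports significance tests; the arithmetic above is the underlying idea.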
“…This issue also covers a range of methodological approaches, including systematic reviews and meta-analysis (Green et al 2016; Higgins and Katsipataki 2016), and advances in methods for dealing with the complexity involved in interventions, through either improved models (Schweig and Pane 2016; Spybrook, Shi, and Kelcey 2016) and/or integrated approaches (Hanley, Cambers, and Haslam 2016; Peterson 2016).…”
Section: Getting These Together and Next Steps
“…In order to alleviate these limitations, she then proposes 'What works 2.0', which combines the core elements of experimental and improvement science into a strategy to raise educational achievement with the support of evidence from randomized experiments. 'Central to this combined effort is a focus on identifying and testing mechanisms for improving teaching and learning, as applications of principles from the learning sciences' (Peterson 2016, 1, citing Bransford et al 2000 and OECD 2007). Similar ideas are shared by Sardar Anwaruddin, who approaches this from a different stance in the next paper: while addressing the crisis of representation (i.e.…”
Racial and ethnic disproportionality in discipline (REDD) represents a longstanding and pervasive issue in the United States educational system. However, researchers and interventionists have not sufficiently provided educators with appropriate frameworks and feasible tools to disrupt REDD and promote equity. The goal of this paper is to present a framework of eight malleable factors associated with REDD and describe the Disproportionality in Discipline Assessment for Schools (DDAS). The DDAS is a suite of user‐friendly tools based on this framework, designed to help school teams identify and address REDD. Two studies are described. Study 1 presents the results of educator feedback on a presentation of the framework and the DDAS in terms of their perception of its feasibility, usability, and validity/logical soundness. Study 2 presents the process of applying the DDAS in four real‐world school settings. Results indicated that the framework and the DDAS were useful and feasible tools to help schools increase equity and address REDD. Modifications to the framework and the DDAS were made to improve validity and appropriateness.