2005
DOI: 10.1080/02796015.2005.12086277

Treatment Implementation Following Behavioral Consultation in Schools: A Comparison of Three Follow-up Strategies

Cited by 342 publications (215 citation statements); references 35 publications.
“…It is possible that the completion of permanent products occurred after intervention implementation (e.g., teacher was not observed announcing criteria for winning, but criteria for winning was written on the poster when permanent products were coded). Estimates of treatment fidelity via self-report have been criticized for being inflated (e.g., Noell et al., 2005); permanent products may be susceptible to the same problem.…”
Section: Discussion | mentioning | confidence: 99%
“…For direct observation, fewer sessions produced a reliable estimate of GBG implementation, which led the authors to endorse this assessment method (Gresham et al., 2017). Self-report has been criticized due to implementers’ tendency to inflate their treatment fidelity ratings (e.g., Noell et al., 2005), although this is not always the case (Sanetti & Kratochwill, 2009b). In comparing treatment fidelity assessments of behavior support plan implementation per direct observation and permanent product, Sanetti and Collier-Meek (2014) found that data collected using direct observation (a) reflected intervention steps more comprehensively, (b) could include an assessment of implementation quality, and (c) were more highly correlated with student outcomes.…”
Section: Treatment Fidelity Assessment | mentioning | confidence: 99%
“…School-based interventions must be implemented sufficiently to impact student outcomes (see DiGennaro Reed & Codding, 2013; Sanetti & Kratochwill, 2009). Low levels of treatment integrity, indicative of inconsistent implementation, have been documented across interventions designed to address academic and behavior concerns for students with disabilities and in general education at the individual student, classroom, and school-wide level (e.g., Bradshaw, Mitchell, & Leaf, 2010; Noell et al., 2005; Sanetti, Collier-Meek, Long, Byron, & Kratochwill, 2015). To investigate why low levels of treatment integrity are so pervasive, researchers have begun to evaluate variables that may influence implementation (e.g., Domitrovich et al., 2008; Durlak & DuPre, 2008; Long et al., 2016).…”
mentioning | confidence: 99%
“…PFB has been shown to be effective for improving the implementation of behavioral supports in instructional packages (Gilbertson, Witt, Singletary, & VanDerHeyden, 2007) as well as decreasing rates of undesired behavior (DiGennaro, Martens, & Kleinmann, 2007; Jones, Wickstrom, & Friman, 1997). PFB methods as described in Noell et al. (2005) consisted of providing frequent feedback regarding the specific components implemented correctly as well as the steps not implemented correctly. The available research suggests that PFB improves overall adherence to intervention protocols and is applicable to a wide range of consumers and applications (Solomon, Klein, & Politylo, 2012), though its applicability to peer-mediated SST has not yet been explored.…”
mentioning | confidence: 99%