The Ontario Human Rights Commission’s (OHRC) Right to Read Report calls for school districts to implement early literacy interventions that have been scientifically proven to be effective for young children with reading difficulties. The acknowledgment of early intervention as an essential service for young children experiencing reading difficulties is a strong and welcome message in the report. However, the report recommends a narrow course for reading interventions in Ontario, drawing on discourse from the Science of Reading community, which questionably frames current interventions, such as Reading Recovery, as unscientific, ineffective commercial programs. In this response, the authors contest the one-sidedness of these recommendations, pointing to a paradox in the report between what constitutes an effective early literacy intervention supported by science and the standards of effectiveness the OHRC requires of interventions it endorses versus those it discredits. Rather than dismissing one approach or the other outright, the authors call on school leadership to consider the broader reading science and the strengths of various approaches instead of narrowing the menu of effective literacy interventions that may support diverse learners.
During the past decade, institutions of higher education have been involved in establishing assessment programs, choosing appropriate instruments, discussing costs and benefits, and finding ways to convince faculty to participate in assessment efforts. But little attention has been given in the literature or in presentations at national meetings to describing strategies for facilitating the use of this information to improve programs and services at institutions. Additionally, some state programs, such as the performance funding policy in Tennessee, now require that institutions provide evidence of how they are using assessment results for improvement. In the past, the results have been used most frequently to convince the public and lawmakers that higher education is accountable and effective. Now educational institutions must demonstrate to a broader audience that efforts are being made to provide better services and higher-quality academic programs and to operate more efficiently and effectively. The users of assessment results must be able to interpret and use findings before these improvements can be effected. Too often institutions collect assessment data, report the results to governing boards or other publics, and then file this valuable information away. It would be interesting to explore the reasons why many faculty, staff, and administrators in higher education have been slow to rely on data for decision making concerning the curriculum, instruction, programs, and services. Statistics can be confusing, and educators often prefer to rely on the traditions of academe, which have worked well in the past. However, having valid data for decision making depends on developing a well-planned program of assessment.
To ensure that assessment results provide information useful for planning improvements, it is essential to review new and established programs to confirm that appropriate processes and procedures for gathering valid and reliable data are in place.