There is much interest in comparing latent traits, such as teacher job satisfaction, in large international surveys. However, different countries respond to questionnaires in different languages and interpret the questions through different cultural lenses, raising doubts about the psychometric equivalence of the measurements. Making valid comparisons depends on the latent traits displaying scalar measurement invariance. Unfortunately, this condition is rarely met across many countries at once. Different approaches that maximize the utility of such surveys, but remain faithful to the principles of measurement invariance testing, are therefore needed. This article illustrates one such approach, involving multiple pairwise comparisons. This enables us to compare teacher job satisfaction in England to 17 of the countries that participated in TALIS 2013. Teacher job satisfaction in England was as low as, or lower than, that in all 17 comparison countries.
The purpose of large-scale international assessments is to compare educational achievement across countries. For such cross-national comparisons to be meaningful, the participating students must be representative of the target population. In this paper, we consider whether this is the case for Canada, a country widely recognised as high performing in the Programme for International Student Assessment (PISA). Our analysis illustrates how the PISA 2015 sample for Canada only covers around half of the 15-year-old population, compared to over 90% in countries like Finland, Estonia, Japan and South Korea. We discuss how this emerges from differences in how children with special educational needs are defined and rules for their inclusion in the study, variation in school participation rates and the comparatively high rates of pupils' absence in Canada during the PISA study. The paper concludes by investigating how Canada's PISA 2015 rank would change under different assumptions about how the non-participating students would have performed were they to have taken the PISA test.
This study examined the immediate effects of remote learning during the COVID-19 shutdown in New Jersey in Spring 2020. This mixed-methods study relied on survey data capturing the experiences, difficulties, and successes of 708 New Jersey public school educators during the first few weeks of the school closures. These educators were teachers, administrators, school librarians, and other school personnel. The disruptions of COVID-19 will leave indelible changes on education in New Jersey and beyond, and this research examines the beginning of these changes. The findings indicate that while educators found support from their administrations, they also encountered a spectrum of difficulties relating to the absence of face-to-face contact with students. At the same time, they reported success in coping with the situation, and in some cases outcomes that surpassed their experiences of schooling before the shutdown.
Highlights: PISA data are widely used as a basis for education policymaking. Yet few people realise that students' background characteristics are used in the creation of PISA scores, or why this is done. This is at least partly due to the fact that the methodology used in PISA is complex and opaquely communicated. In this paper we first replicate how PISA scores are created to demonstrate this process. We then systematically alter how the background variables (such as student characteristics) are used in the computation of PISA scores to investigate how this affects the results. While countries' mean achievement is robust for the major domain, different specifications in how PISA scores are generated were found to lead to important changes for one of the minor domains (reading). This sensitivity of PISA results to the precise methodology used is even more pronounced when we look at measures of inequality. Changes to how