To ensure the validity of an assessment programme, it is essential to align it with the intended learning outcomes (LOs). We present a model for ensuring assessment validity that supports this constructive alignment and uses learning analytics (LA). The model is based on LA that include a comparison between ideal LO weights (expressing the prioritization of LOs), actual assessment weights (maximum assessment points per LO), and student assessment results (points actually obtained per LO), as well as clustering and trace-data analysis. These analytics are part of a continuous improvement cycle, including strategic planning and learning design (LD) supported by LO prioritization, and monitoring and evaluation supported by LA. To illustrate and test the model, we conducted a study of a graduate-level higher education course in applied mathematics, analysing student assessment results and activity in a learning management system. The study showed that the analyses provided valuable insights with practical implications for the development of sound LD, tailored educational interventions, databases of assessment tasks, recommendation systems, and self-regulated learning. Future research should investigate the possibilities for automating such LA, to enable full exploitation of their potential and their use in everyday teaching and learning.

Practitioner notes

What is already known about this topic
- To develop sound, student-centred learning design (LD), it is essential to ensure that assessment is constructively aligned with the intended learning outcomes (LOs).
- This constructive alignment is crucial for ensuring the validity of an assessment programme.
- Learning analytics (LA) can provide insights that help develop valid assessment programmes.

What this paper adds
- As not all LOs are equally important, assessment programmes should reflect the prioritization of LOs, which can be determined using various multi-criteria decision-making (MCDM) methods.
- This article presents and illustrates, based on an empirical case, a model of continuous improvement of LD that uses LA to compare how LOs are reflected in (actual) student results, in an (actual) assessment programme, and in the (ideal) prioritization of LOs based on MCDM.
- The study shows how clustering students on the basis of their assessment results can be used in LA to provide insights for educational interventions better targeted to students' needs.

Implications for practice and/or policy
- The proposed LA can provide important insights for developing (or improving) LD in line with the intended course LOs, and also study-programme LOs (if course and study-programme LOs are properly aligned).
- The LA can also contribute to the development of databases of assessment tasks aligned with course LOs, with ensured validity, supporting sharing and reuse, as well as to the development of tailored educational interventions (eg, based on clustering).
- The proposed LA can further contribute to recommendation systems offering LD-improvement recommendations for teachers or learning suggestions for students, as well as supporting students' metacognition and self-regulated learning.
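The core comparison described in the abstract can be illustrated with a minimal sketch. All LO names, weights, and point totals below are invented for illustration; the abstract does not specify the actual weighting scheme or divergence measure, so a simple total-absolute-deviation comparison of normalized per-LO weight distributions is assumed here.

```python
# Hypothetical sketch of comparing ideal LO weights, actual assessment
# weights, and student results, as described in the abstract.
# All numbers are invented for illustration.

def normalize(weights):
    """Scale a list of per-LO weights so they sum to 1."""
    total = sum(weights)
    return [w / total for w in weights]

# Ideal LO weights (eg, from an MCDM-based prioritization of LOs)
ideal = normalize([5, 3, 2])          # LO1, LO2, LO3

# Actual assessment weights (maximum assessment points per LO)
assessment = normalize([40, 40, 20])

# Student results (points actually obtained per LO, aggregated)
results = normalize([30, 35, 10])

def misalignment(p, q):
    """Total absolute deviation between two normalized weight vectors."""
    return sum(abs(a - b) for a, b in zip(p, q))

# Gap between the ideal prioritization and the assessment design,
# and between the assessment design and what students achieved:
design_gap = misalignment(ideal, assessment)
outcome_gap = misalignment(assessment, results)
print(design_gap, outcome_gap)
```

A smaller deviation indicates closer alignment; in practice, any suitable distance between distributions could play this role.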
This paper presents pilot research on the approaches taken by European Union (EU) Member States (MSs) in the emergency response to the COVID-19 crisis in pre-tertiary education. It includes a multi-case study of four MSs, examining their education systems' digital readiness and "fitness for change", as well as their decision-making practices. We use the Cynefin framework of decision contexts to explore different paths in shifting from chaos to complexity. The preliminary findings indicate that the factor positively influencing education systems' response was not exclusively their digital readiness, but also their "fitness" in terms of ongoing reforms and preparedness for change. The findings also suggest that MSs generally continued with their usual decision-making practices, with tendencies towards centralisation of crucial decisions, which is in line with the Cynefin framework's argument for stronger leadership in chaotic contexts.
Professional development (PD) is a key element for enhancing the quality of academic teaching. An increasing number of PD activities have moved to blended and online formats, especially since the COVID-19 pandemic. Driven by the desire, potential, and need for collaboration among educators to learn from innovative and best practices, several institutions have begun pooling their resources and expertise to implement cross-institutional and cross-national online professional development (OPD). The questions of what type of (cross-)institutional OPD educators might prefer, and whether educators learn effectively from (and with) peers in such cross-cultural contexts, have not been adequately explored empirically. In this case study across three European countries, we explored the lived experiences of 86 educators participating in a cross-institutional OPD. Using a mixed-methods design, our pre-post findings indicated that, on average, participants made substantial gains in knowledge. In addition, several cultural differences were evident in the expectations and lived experiences of OPD, as well as in the intention to transfer what had been learned into one's own practice. This study indicates that while cross-institutional OPD offers substantial economic and pedagogical affordances, cultural differences in context might affect the extent to which educators implement lessons learned from OPD.