Learning Analytics focuses on the collection and analysis of learners' data to improve their learning experience by providing informed guidance and to optimise learning materials. To support research in this area we have developed a dataset containing data from courses presented at the Open University (OU). What makes the dataset unique is that it combines demographic data with aggregated clickstream data of students' interactions in the Virtual Learning Environment (VLE). This enables the analysis of student behaviour, represented by their actions. The dataset contains information about 22 courses, 32,593 students, their assessment results, and logs of their interactions with the VLE represented by daily summaries of student clicks (10,655,280 entries).
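The daily click summaries described above are naturally aggregated per student and course before modelling. A minimal sketch of that aggregation, using hypothetical rows whose column names only loosely follow the released files and are not taken from the dataset's documentation:

```python
from collections import defaultdict

# Hypothetical rows mimicking the dataset's daily VLE click summaries:
# (student_id, course_code, day_of_course, sum_of_clicks_that_day)
rows = [
    ("s1", "AAA", 0, 11),
    ("s1", "AAA", 1, 4),
    ("s2", "AAA", 0, 7),
    ("s2", "AAA", 3, 2),
]

# Collapse daily summaries into one total per (student, course) pair.
total_clicks = defaultdict(int)
for student, course, day, clicks in rows:
    total_clicks[(student, course)] += clicks

print(total_clicks[("s1", "AAA")])  # → 15
print(total_clicks[("s2", "AAA")])  # → 9
```

Per-week or per-activity-type totals follow the same pattern, just with a different grouping key.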
By collecting longitudinal learner and learning data from a range of resources, predictive learning analytics (PLA) are used to identify learners who may not complete a course, typically described as being at risk. Mixed effects have been observed in how teachers perceive, use, and interpret PLA data, necessitating further research in this direction. The aim of this study is to evaluate whether providing teachers in a distance learning higher education institution with PLA data that predicts students' performance empowers them to identify and assist students at risk. Using principles of the Technology Acceptance and Academic Resistance models, a university-wide, multi-methods study with 59 teachers, nine courses, and 1,325 students revealed that teachers can positively affect students' performance when engaged with PLA. Follow-up semi-structured interviews illuminated teachers' actual uses of the predictive data and revealed its impact on teaching practices and intervention strategies to support students at risk.
A vast number of studies, albeit mostly small-scale, have reported exciting innovations and practices in the field of learning analytics. Whilst these studies provide substantial insights, there are still relatively few that have explored how the perspectives and involvement of stakeholders (i.e., teachers, students, researchers, management) influence large-scale, institution-wide adoption of learning analytics. This study reports on one such large-scale and long-term implementation of Predictive Learning Analytics (PLA) spanning a period of four years at a distance learning university. OU Analyse (OUA) is the PLA system used in this study, providing predictive insights to teachers about students and their chance of passing a course. Over the last four years, OUA has been accessed by 1,182 unique teachers and reached 23,640 students in 231 undergraduate online courses. The aim of this study is twofold: (a) to reflect on the macro-level of adoption by detailing usage, challenges and factors facilitating adoption at the organisational level, and (b) to detail the micro-level of adoption, that is, the teachers' perspectives on OUA. Amongst the factors critical to the scalable PLA implementation were: the faculty's engagement with OUA, teachers as "champions", evidence generation and dissemination, digital literacy, and conceptions about teaching online.
Substantial progress in learning analytics research has been made in recent years to predict which groups of learners are at risk. In this chapter we argue that the largest challenge for learning analytics research and practice still lies ahead of us: identifying, using learning analytics modelling, which types of interventions have a positive impact on learners' Attitudes, Behaviour and Cognition (ABC). Two embedded case studies in social science and science are discussed, whereby notions of evidence-based research are illustrated by scenarios (quasi-experimental, A/B testing, RCT) for evaluating the impact of interventions. Finally, we discuss how a Learning Analytics Intervention and Evaluation Framework (LA-IEF) is currently being implemented at the Open University UK using principles of design-based research and evidence-based research.
This study presents an advanced predictive learning analytics system, OU Analyse (OUA), and evidence from its evaluation with online teachers at a distance learning university. OUA is a predictive system that uses machine learning methods for the early identification of students at risk of not submitting (or failing) their next assignment. Teachers have access, via interactive dashboards, to weekly predictions of the risk of failing for each of their students. In this study, we examined how the degree of OUA usage by 559 teachers, of whom 189 were given access to OUA, related to the learning outcomes of more than 14,000 students in 15 undergraduate courses. Teachers who made "average" use of OUA, that is, accessed OUA throughout the life cycle of a course presentation (in particular, between 10% and 40% of the weeks a course was running) and intervened with students flagged as at risk, were found to benefit their students the most; after controlling for differences in academic performance, these students were found to have significantly better performance than their peers in the previous year's course presentation, during which the same teachers made no use of predictive learning analytics. Predictive learning analytics is an innovative student support approach in online pedagogy that, as shown in this study, can empower online teachers to effectively monitor and intervene with their students, over and above other approaches, and result in improved learning outcomes.
What is already known about this topic:
- Pedagogical and personal support to students is a significant responsibility of online teachers.
- Student support is a challenging activity due to the lack of face-to-face interactions.
- Predictive learning analytics (PLA) can identify students at risk of failing their studies.
What this paper adds:
- One of the few large-scale studies available that examines the impact of analytics on student performance.
- Teachers' usage of PLA was significantly related to better learning outcomes.
- Online teachers had students with better learning outcomes when accessing PLA data than when they had no access.
Implications for practice and/or policy:
- PLA can empower online teachers and complement teaching practice.
- PLA can help in the identification of, and proactive intervention with, students at risk of failing their studies.
- Actions should be taken to motivate and engage online teachers with PLA.
This paper focuses on the problem of identifying students who are at risk of failing their course. The presented method proposes a solution in the absence of data from previous courses, which are usually used for training machine learning models; this situation typically occurs in new courses. We present the concept of a "self-learner" that builds the machine learning models from the data generated during the current course. The approach utilises information about already submitted assessments, which introduces the problem of imbalanced data for training and testing the classification models. There are three main contributions of this paper: (1) the concept of training the models for identifying at-risk students using data from the current course, (2) specifying the problem as a classification task, and (3) tackling the challenge of imbalanced data, which appears in both training and testing data. The results compare the proposed concept with the traditional approach of training models on legacy course data, validating the concept.