Massive Open Online Courses (MOOCs) are growing substantially in number, and also in interest from the educational community. MOOCs offer particular challenges for what is becoming accepted as mainstream practice in learning analytics. Partly for this reason, and partly because of the relative newness of MOOCs as a widespread phenomenon, there is not yet a substantial body of literature on the learning analytics of MOOCs. However, one clear finding is that drop-out/non-completion rates are substantially higher than in more traditional education. This paper explores these issues, and introduces the metaphor of a 'funnel of participation' to reconceptualise the steep drop-off in activity, and the pattern of steeply unequal participation, which appear to be characteristic of MOOCs and similar learning environments. Empirical data to support this funnel of participation are presented from three online learning sites: iSpot (observations of nature), Cloudworks ('a place to share, find and discuss learning and teaching ideas and experiences'), and openED 2.0, a MOOC on business and management that ran from 2010 to 2012. Implications of the funnel for MOOCs, formal education, and learning analytics practice are discussed.
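The funnel metaphor can be made concrete with a small sketch: given each user's deepest stage of participation, count how many users reached at least each successive stage. The stage names and event data below are illustrative assumptions, not drawn from the iSpot, Cloudworks, or openED 2.0 datasets.

```python
from collections import Counter

# Hypothetical records of (user_id, deepest_stage_reached).
# Stage names are illustrative, not taken from the paper's sites.
STAGES = ["visited", "registered", "contributed", "completed"]

events = [
    ("u1", "visited"), ("u2", "registered"), ("u3", "contributed"),
    ("u4", "visited"), ("u5", "registered"), ("u6", "visited"),
    ("u7", "completed"), ("u8", "visited"), ("u9", "registered"),
    ("u10", "visited"),
]

def funnel(events, stages):
    """Count how many users reached at least each stage."""
    depth = {s: i for i, s in enumerate(stages)}
    deepest = Counter(stage for _, stage in events)
    counts = []
    for i, stage in enumerate(stages):
        # A user at a later stage has, by definition, passed this one.
        counts.append(sum(n for s, n in deepest.items() if depth[s] >= i))
    return dict(zip(stages, counts))

print(funnel(events, STAGES))
# → {'visited': 10, 'registered': 5, 'contributed': 2, 'completed': 1}
```

The steep narrowing from one stage to the next is the characteristic shape the paper describes: most users appear only at the widest end of the funnel.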
Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners, and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical to an educational sense of teaching. However, learning analytics offers new routes for teachers to understand their students, and hence to make effective use of their limited resources. This paper explores these issues, and describes a series of examples of learning analytics to illustrate the potential. It argues that teachers can and should engage with learning analytics as a way of influencing the metrics agenda towards richer conceptions of learning, and of improving their teaching.

Keywords: learning analytics; analytics; metrics

Introduction

There is a tension between the framing of education as an economic activity and conceptions of education and learning that are concerned with the development of meaning and the transformation of understanding. These difficulties are far from purely theoretical concerns: they increasingly have very practical, concrete consequences for teachers and learners, notably around resource constraints, class sizes, and time pressures. Within this constrained environment, teachers are subject to accountability processes based on, and enabled by, the deployment of quantitative metrics of their practices. Quantitative metrics are increasingly used not only because of theoretical framings that support them, but also because of a substantial and dramatic change in their practicability over the last ten or twenty years. This change is often referred to as Big Data: the quantity, range and scale of data that can be and is gathered has increased exponentially (or close to exponentially).
Accompanying this explosion of data is a series of rapid advances in computational techniques for managing, processing and analysing these large volumes of data in ways that are actionable. These developments are transforming enquiry. The scale of data is greatest in science: for instance, the Large Hadron Collider at CERN produced 23 petabytes (23 million gigabytes) of information in 2011 (CERN, 2012). The effect is not restricted to science: for instance, the ability to manage and integrate textual and geographic data is changing scholarly practice in the classics (see e.g. Project HESTIA: the Herodotus Encoded Space-Text Imaging Archive, http://www.open.ac.uk/Arts/hestia/index.html). New approaches become possible: for instance, rather than sampling, an entire population can be captured. The volume and scope of data can be so large that it is possible to start with a dataset and apply computational methods to produce results, and only subsequently to seek an interpretation or meaning. Big data is by no means restricted to the academy. Technology companies such as Google and Facebook make managing staggeringly large datasets their core business, but even companies...
ABSTRACT: A core goal for most learning analytics projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximise success, implementation of learning analytics at scale requires explicit and careful consideration of the entire TEL technology complex: the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. It is crucial not only to provide analytics and their associated tools, but also to begin with a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In this paper, we offer tools and case studies that will support educational institutions in deploying learning analytics at scale with the goal of achieving specified learning and teaching objectives. The ROMA Framework offers a step-by-step approach to the institutional implementation of learning analytics, and this approach is grounded here by case studies of practice from the UK and Australia.
This paper develops Campbell and Oblinger's [4] five-step model of learning analytics (Capture, Report, Predict, Act, Refine) and other theorisations of the field, and draws on broader educational theory (including Kolb and Schön) to articulate an incrementally more developed, explicit and theoretically-grounded Learning Analytics Cycle. This cycle conceptualises successful learning analytics work as four linked steps: learners (1) generate data (2), which is used to produce metrics, analytics or visualisations (3). The key step is 'closing the loop' by feeding back this product to learners through one or more interventions (4). This paper seeks to begin to place learning analytics practice on a base of established learning theory, and draws several implications from this theory for the improvement of learning analytics projects. These include speeding up or shortening the cycle so feedback happens more quickly, and widening the audience for feedback (in particular, considering learners and teachers as audiences for analytics) so that it can have a larger impact.
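The four linked steps of such a cycle can be sketched as a minimal pipeline. The particular metric (a simple activity threshold) and intervention (a reminder message) below are illustrative assumptions, not instruments proposed by the paper.

```python
# A minimal sketch of a four-step learning analytics cycle:
# learners -> data -> metric -> intervention ('closing the loop').
# Names, thresholds, and the reminder intervention are hypothetical.

def generate_data(learners):
    # Steps 1-2: learners generate data (here, toy weekly activity counts).
    return {name: activity for name, activity in learners}

def compute_metric(data, threshold=3):
    # Step 3: produce a metric -- flag learners below an activity threshold.
    return [name for name, activity in data.items() if activity < threshold]

def intervene(at_risk):
    # Step 4: feed the product back to learners as an intervention.
    return {name: "reminder sent" for name in at_risk}

data = generate_data([("ana", 5), ("ben", 1), ("chi", 2)])
print(intervene(compute_metric(data)))
# → {'ben': 'reminder sent', 'chi': 'reminder sent'}
```

Shortening this loop, as the paper suggests, corresponds to running the pipeline more frequently so that interventions reach learners while they can still act on them.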
Abstract. Massive open online courses (MOOCs) are part of the lifelong learning experience of people worldwide. Many of these learners participate fully. However, the high levels of dropout on most of these courses are a cause for concern. Previous studies have suggested that there are patterns of engagement within MOOCs that vary according to the pedagogy employed. The current paper builds on this work and examines MOOCs from different providers that have been offered on the FutureLearn platform. A cluster analysis of these MOOCs shows that engagement patterns are related to pedagogy and course duration. Learners did not work through a three-week MOOC in the same ways that they worked through the first three weeks of an eight-week MOOC.
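A cluster analysis of the kind described above can be sketched by grouping learners' weekly engagement vectors with a small k-means routine. The data, the cluster count, and the stdlib-only implementation are illustrative assumptions; the paper's own analysis was run over FutureLearn course data.

```python
import random

# Hypothetical weekly engagement vectors (steps completed per week)
# for a three-week MOOC; values and k=3 are illustrative assumptions.
learners = [
    [9, 8, 9],   # engages throughout
    [8, 9, 7],
    [7, 2, 0],   # samples early, then leaves
    [9, 1, 0],
    [2, 0, 0],   # barely starts
    [1, 0, 0],
]

def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means: repeatedly assign points to the nearest centroid."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[j].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [
            [sum(dim) / len(c) for dim in zip(*c)] if c else centroids[j]
            for j, c in enumerate(clusters)
        ]
    return clusters

for cluster in kmeans(learners, k=3):
    print(cluster)
```

Clusters such as these underpin the engagement patterns the paper relates to pedagogy and course duration: week-by-week profiles, not just completion counts, distinguish the groups.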