This article addresses a relatively unexplored area in the emerging field of learning analytics: the design of learning analytics interventions. A learning analytics intervention is defined as the surrounding frame of activity through which analytic tools, data, and reports are taken up and used. It is a soft technology that involves the orchestration of the human process of engaging with the analytics as part of the larger teaching and learning activity. This paper first makes the case for the overall importance of intervention design, situating it within the larger landscape of the learning analytics field, and then considers the specific issues of intervention design for student use of learning analytics. Four principles of pedagogical learning analytics intervention design that can be used by teachers and course developers to support the productive use of learning analytics by students are introduced: Integration, Agency, Reference Frame, and Dialogue. In addition, three core processes in which to engage students are described: Grounding, Goal-Setting, and Reflection. These principles and processes are united in a preliminary model of pedagogical learning analytics intervention design for students, presented as a starting point for further inquiry.
The process of using analytic data to inform instructional decision-making is acknowledged to be complex; however, details of how it occurs in authentic teaching contexts have not been fully unpacked. This study investigated five university instructors' use of a learning analytics dashboard to inform their teaching. The existing literature was synthesized to create a template for inquiry that guided interviews, and inductive qualitative analysis was used to identify salient emergent themes in how instructors 1) asked questions, 2) interpreted data, 3) took action, and 4) checked impact. Findings showed that instructors did not always come to analytics use with specific questions, but rather with general areas of curiosity. Questions additionally emerged and were refined through interaction with the analytics. Data interpretation involved two distinct activities, often accompanied by affective reactions to data: reading data to identify noteworthy patterns and explaining their importance in the course using contextual knowledge. Pedagogical responses to the analytics included whole-class scaffolding, targeted scaffolding, and revising course design, as well as two new non-action responses: adopting a wait-and-see posture and engaging in deep reflection on pedagogy. Findings were synthesized into a model of instructor analytics use that offers useful categories of activities for future study and support.
This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and instructors could use to reflect on their discussion participation. Two kinds of analytics were designed: some embedded in the learning environment to provide students with real-time information on their activity in progress, and some extracted from the learning environment and presented to students in a separate digital space for reflection. In addition, we describe the design of an intervention through which use of the analytics can be introduced as an integral course activity. Findings from an initial implementation of the application indicated that the learning analytics intervention supported changes in students' discussion participation. Five issues for future work on learning analytics in online discussions are presented: 1) unintentional versus purposeful change; 2) differing changes prompted by the same analytic; 3) the importance of theoretical buy-in and calculation transparency for perceived analytic value; 4) affective components of students' reactions; and 5) support for students in the process of enacting analytics-driven changes.
ABSTRACT: It is an exhilarating and important time for conducting research on learning, with unprecedented quantities of data available. There is a danger, however, in thinking that with enough data, the numbers speak for themselves. In fact, with larger amounts of data, theory plays an ever-more critical role in analysis. In this introduction to the special section on learning analytics and learning theory, we describe some critical problems in the analysis of large-scale data that occur when theory is not involved. These questions revolve around what variables a researcher should attend to and how to interpret a multitude of micro-results and make them actionable. We conclude our comments with a discussion of how the collection of empirical papers included in the special section, and the commentaries that were invited on them, speak to these challenges, and in doing so represent important steps towards theory-informed and theory-contributing learning analytics work. Our ultimate goal is to provoke a critical dialogue in the field about the ways in which learning analytics research draws on and contributes to theory.
With the implementation of competency-based medical education (CBME) in emergency medicine, residency programs will amass substantial amounts of qualitative and quantitative data about trainees' performances. This increased volume of data will challenge traditional processes for assessing trainees and remediating training deficiencies. At the intersection of trainee performance data and statistical modeling lies the field of medical learning analytics. At a local training program level, learning analytics has the potential to assist program directors and competency committees with interpreting assessment data to inform decision making. On a broader level, learning analytics can be used to explore system questions and identify problems that may impact our educational programs. Scholars outside of health professions education have been exploring the use of learning analytics for years and their theories and applications have the potential to inform our implementation of CBME. The purpose of this review is to characterize the methodologies of learning analytics and explore their potential to guide new forms of assessment within medical education.
This research experimentally manipulated the social presence cues in instructors' messages to students. The context was an online professional development one-credit course with one-to-one mentoring of students. Additionally, student learning intentions and levels of trust were examined as factors that may moderate the effects of social presence. Results indicate that social presence affects the learner's interactions with and perception of the instructor but has no effect on perceived learning, satisfaction, engagement, or the quality of their final course product. These findings suggest social presence is a correlational rather than a causal variable associated with student learning. Exploratory analyses suggest that trust and learning intentions are potentially important factors impacting student perceptions of the learning environment and performance in the course, respectively.