ABSTRACT: In this paper, we investigate the correspondence between student affect and behavioural engagement in a web-based tutoring platform throughout the school year and learning outcomes at the end of the year on a high-stakes mathematics exam, in a manner that is both longitudinal and fine-grained. Affect and behaviour detectors are used to estimate student affective states and behaviour based on post-hoc analysis of tutor log-data. For every student action in the tutor, the detectors give us an estimated p…
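The per-action detector output described in the abstract can be illustrated with a toy sketch. Everything here is hypothetical (synthetic data, invented field names such as `p_boredom`), not the authors' actual pipeline: each log row carries estimated probabilities of affective states, which are averaged over a student's actions into year-level features of the kind one could then relate to exam scores.

```python
import random
import statistics

random.seed(0)

# Hypothetical detector output: one row per student action in the tutor,
# each with an estimated probability for two affective states.
log = [
    {"student": s, "p_boredom": random.random(), "p_engaged": random.random()}
    for s in ["s1", "s2", "s3"]
    for _ in range(50)
]

# Aggregate per-action estimates into per-student, year-level features
# by averaging the detector confidences over all of a student's actions.
features = {}
for row in log:
    per_student = features.setdefault(row["student"], {"p_boredom": [], "p_engaged": []})
    per_student["p_boredom"].append(row["p_boredom"])
    per_student["p_engaged"].append(row["p_engaged"])

profiles = {
    s: {state: statistics.mean(values) for state, values in probs.items()}
    for s, probs in features.items()
}
for s, p in sorted(profiles.items()):
    print(s, round(p["p_boredom"], 3), round(p["p_engaged"], 3))
```

In the study itself these aggregated affect estimates were then correlated with end-of-year exam performance; the sketch stops at the feature-construction step.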
“…Baker, D'Mello, Rodrigo, & Graesser (2010) and Pardos, Baker, San Pedro, Gowda, & Gowda (2013) are examples of work using human annotated affective states. In Pardos et al. (2013), the researchers used the Baker-Rodrigo Observation Method Protocol (BROMP) (Ocumpaugh, Baker, & Rodrigo, 2012) to correlate student behaviour and affect while participating in cognitive tutoring activities with performance on standardized tests.…”
Section: Human Annotated Affective States
“…In Pardos et al. (2013), the researchers used the Baker-Rodrigo Observation Method Protocol (BROMP) (Ocumpaugh, Baker, & Rodrigo, 2012) to correlate student behaviour and affect while participating in cognitive tutoring activities with performance on standardized tests. They found that the learning gains associated with certain affective states, namely boredom and confusion, are highly dependent on the level of scaffolding that the student is receiving.…”
Section: Human Annotated Affective States
Worsley, M. (2016). Multimodal learning analytics and education data mining: Using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220-238. http://dx.doi.org/10.18608/jla.2016.32.11 ISSN 1929-7750
Marcelo Worsley, Learning Sciences & Computer Science, Northwestern University, USA, marcelo.worsley@northwestern.edu
ABSTRACT: New high-frequency multimodal data collection technologies and machine learning analysis techniques could offer new insights into learning, especially when students have the opportunity to generate unique, personalized artifacts, such as computer programs, robots, and solutions to engineering challenges. To date, most of the work on learning analytics and educational data mining has focused on online courses and cognitive tutors, both of which provide a high degree of structure to the tasks and are restricted to interactions that occur in front of a computer screen. In this paper, we argue that multimodal learning analytics can offer new insights into student learning trajectories in more complex and open-ended learning environments. We present several examples of this work and its educational applications.
Keywords: Learning analytics, multimodal interaction, constructivism, constructionism, assessment
INTRODUCTION
The same battle is fought in every field of educational research and practice: the champions of the direct instruction of well-defined content pitted against those who encourage student-centred exploration of ill-defined domains. These wars have taken place repeatedly over past decades, and partisans on each side have been reborn in multiple incarnations. The first tradition tends to be aligned with behaviourist or neo-behaviourist approaches, while the second favours constructivist-inspired pedagogies. In language arts, the battle has been between phonics and the whole-word approach. In math, the war is waged between teaching algorithms and instruction in how to think mathematically. In history, the two sides clash over the relative merits of critical interpretations and the memorization of historical facts. In science, they clash over inquiry-based approaches versus direct instruction of formulas and principles.
“…Craig and colleagues [16] investigated the relationships between learning gains and affective state and found that confusion and flow were positively associated with learning gains, but boredom was negatively associated with learning. Pardos and colleagues [30] also found that affect in intelligent tutors can predict not just local learning, but longer-term learning outcomes (state standardized exam scores) as well, specifically finding that boredom is negatively associated with longer-term learning outcomes while engaged concentration (e.g. flow) and frustration were positively associated with learning gains.…”
Section: Introduction
“…This work has generally found that a range of disengaged behaviors are associated with negative learning outcomes, including both gaming the system and off-task behavior [cf. 1,15,30].…”
Abstract. Recent research has shown that differences in software design and content are associated with differences in how much students game the system and go off-task. In particular, the design features of a tutor have been found to predict substantial amounts of variance in gaming and off-task behavior. However, it is not yet understood how this influence takes place. In this paper, we investigate the relationship between a student's affective state, their tendency to engage in disengaged behavior, and the design aspects of the learning environments, towards understanding the role that affect plays in this process. To investigate this question, we integrate an existing taxonomy of the features of tutor lessons [3] with automated detectors of affect [8]. We find that confusion and frustration are significantly associated with lesson features that were found to be associated with disengaged behavior in past research. At the same time, we find that the affective state of engaged concentration is significantly associated with features associated with lower frequencies of disengaged behavior. This analysis suggests that simple re-designs of tutors along these lines may lead to both better affect and less disengaged behavior.
“…For example, data have been leveraged to increase student retention by creating an early-warning system to allow faculty to notify students who may be at risk of failing a particular course [19]. Also, data have been used to understand differences across students in online learning strategies to allow course designers to build a more personalized experience for different subgroups of learners [20,21]. With an increase in the number of available data sources, colleges and universities have a great opportunity to explore how data can shape, enhance, and direct learning at all levels.…”
Learning management systems (LMS) are ubiquitous among colleges and universities worldwide; however, they are thought of as a transactional warehouse rather than an opportunity to understand student learning outside of the classroom. Each LMS is used differently across campuses, and even across sections of the same courses taught by different instructors. For instance, one instructor might utilize a gradebook feature that allows students to view their assignment grades, while another instructor in the same course might use a different method to distribute grades for assignments. But is there a differential relationship between tool usage and student engagement or performance in the course across these sections? Our research addresses this issue and seeks to understand the nature of how LMS tools are used by students and how the use of those tools may shed insight on student learning or engagement.
We ground our work theoretically using the Academic Plan Model to understand how freshman engineering students' use of LMS tools relates to their performance in the class. The Academic Plan Model details potential influences on curriculum design at the course, program, and institutional levels. As the Model suggests, faculty members may (or should) consider learners, instructional resources, and instructional processes when developing their curricular plans. Prior research within and outside engineering, however, has shown that faculty tend not to draw on available data when considering these components, if they even consider them at all. Our study presents an idea for bringing data into those considerations by focusing on the course-level activities of students within an LMS. We empirically describe an opportunity educators have to understand what can be learned from investigating LMS student data.
Specifically, our data set consists of student LMS log files (approximately 15 million rows) for one engineering course (36 sections and 876 students) from Fall 2013. Results show clear patterns of student engagement with different LMS tools across final grades within first-year engineering courses. Additionally, certain tool usage relates more strongly with course performance. By understanding how and when students use those tools in particular, faculty members may be able to create more data-informed course plans and provide empirically driven feedback to students on their levels of engagement in the class.