Proceedings of the 2014 ACM Workshop on Multimodal Learning Analytics Workshop and Grand Challenge
DOI: 10.1145/2666633.2666636

Holistic Analysis of the Classroom

Abstract: Should we judge the quality of a class by the grades the students and teacher receive at the end of the semester, or by how the group collaborated during the semester towards acquiring new knowledge? Until recently, the latter approach was all too inaccessible due to the complexity and time needed to evaluate every class. With the development of new technologies in video processing, gaze tracking, and audio analysis, we are getting the opportunity to go further with our analysis and go around the p…

Cited by 23 publications (11 citation statements)
References 23 publications (22 reference statements)
“…The movements of the torso can provide GBM, which is typically derived from video cameras. GBM was used by Raca and Dillenbourg () in their study to assess students' attention from their body posture, gesturing, and other cues. Similarly, Bosch et al () used GBM to detect learners' emotions in combination with facial expressions and learning activity.…”
Section: Literature Survey
confidence: 99%
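The statement above describes deriving gross body movement (GBM) from video cameras. The cited papers do not spell out their pipeline here, but a minimal frame-differencing sketch of the general technique (an assumed illustration, not the authors' actual method; the function name and threshold are hypothetical) could look like:

```python
import numpy as np

def gross_body_movement(frames, threshold=15):
    """Score gross body movement as the fraction of pixels whose
    intensity changes by more than `threshold` between consecutive
    grayscale frames (a common frame-differencing proxy)."""
    scores = []
    for prev, curr in zip(frames, frames[1:]):
        # cast to a signed type so the subtraction cannot wrap around
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        scores.append(float((diff > threshold).mean()))
    return scores

# synthetic example: a static frame repeated, then a shifted copy
rng = np.random.default_rng(0)
f0 = rng.integers(0, 256, size=(48, 64), dtype=np.uint8)
f1 = np.roll(f0, 4, axis=1)  # simulated lateral torso shift
scores = gross_body_movement([f0, f0, f1])
print(scores)  # first score is 0.0 (identical frames); second is much higher
```

In practice a real pipeline would add background subtraction, smoothing over time, and per-person segmentation; the per-frame score here is only the raw motion-energy signal such systems start from.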
“…Facial expressions are heavily investigated for emotion recognition in affective computing research and have been used quite extensively in multimodal human–computer interaction experiments (e.g., Alyuz et al, ; Bosch et al, ; Hussain et al, ). Eye-tracking, commonly used as an indicator of learners' attention, has also been used with multimodal data sets (Edwards et al, ; Prieto, Sharma, Dillenbourg, & Rodríguez-Triana, ; Raca & Dillenbourg, ). Finally, analysis of speech spans from paralinguistic features like speaking time, keywords pronounced, or prosodic features like tone and pitch (e.g., Prieto et al, ) to actual recognition of spoken words in dialogic settings like student–teacher interactions (D'mello et al, ).…”
Section: Literature Survey
confidence: 99%
“…Sensors can monitor both the physical environment where the learner operates and the learner's behaviour, including 360-degree body movements, physiological responses such as heart rate or body temperature, and interpersonal communication such as student–teacher or student–peer discussions. In the related literature, sensors have been used to track different modalities such as hand gestures [24,40], gross body movements [7], eye-tracking [28,29], and facial expressions [3,4]. Sensors were also used to monitor physiological signals such as heart rate [1,16], galvanic skin response [14,25], and brain waves [1,28].…”
Section: Sensors In Learning
confidence: 99%