2023
DOI: 10.1109/access.2023.3325407
Enhancing Facial Expression Recognition System in Online Learning Context Using Efficient Deep Learning Model

Mohammed Aly,
Abdullatif Ghallab,
Islam S. Fathi

Abstract: This paper presents an online educational platform that leverages facial expression recognition technology to monitor students' progress within the classroom. Periodically, a camera captures images of students in the classroom, processes these images, and extracts facial data through detection methods. Subsequently, students' learning statuses are assessed using expression recognition techniques. The developed approach then dynamically refines and enhances teaching strategies using the acquired learning status…
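The abstract describes a periodic monitoring loop: capture a classroom frame, detect faces, classify each student's expression, and summarize learning status. A minimal sketch of that loop is below; it is not the paper's implementation — `detect_faces` and `classify_expression` are hypothetical stubs standing in for the real detector and deep-learning classifier, and the frame encoding is invented for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List

# The six basic emotions targeted by the recognizer described in the paper.
EMOTIONS = ["happiness", "disgust", "anger", "surprise", "sadness", "fear"]

@dataclass
class Face:
    student_id: int
    crop: bytes  # placeholder for the detected face region

def detect_faces(frame: bytes) -> List[Face]:
    """Stub detector: a real system would run a face detector
    (e.g. a cascade or CNN detector) on the classroom frame.
    Here we pretend each byte of the frame is one student's face."""
    return [Face(student_id=i, crop=bytes([b])) for i, b in enumerate(frame)]

def classify_expression(face: Face) -> str:
    """Stub classifier: a real system would run the trained
    expression-recognition model on the face crop."""
    return EMOTIONS[face.crop[0] % len(EMOTIONS)]

def assess_learning_status(frame: bytes) -> Dict[int, str]:
    """One monitoring cycle: detect faces, classify expressions,
    and return a per-student status summary for the teacher."""
    return {f.student_id: classify_expression(f) for f in detect_faces(frame)}

statuses = assess_learning_status(bytes([0, 3, 5]))
print(statuses)  # {0: 'happiness', 1: 'surprise', 2: 'fear'}
```

In a deployed system the summary would feed back into teaching-strategy adjustment, which is the adaptive step the abstract emphasizes.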

Cited by 4 publications (6 citation statements) · References 45 publications
“…This capability is crucial in education for several reasons. Firstly, it allows educational … There are online learning platforms that utilize affective computing principles to accurately identify six fundamental emotions: happiness, disgust, anger, surprise, sadness, and fear (Aly et al., 2023). Recognizing and addressing this range of emotions allows educational strategies to be more contextualized and effective.…”
Section: Optimizing Learning Through Affective Computing
confidence: 99%
“…It also provides several important pieces of information in communication such as the emotional and mental state of a person and also the intention, attitude, and personality of the communicating person. Several research fields have also tried to replicate this facial expression identification on a computational platform [1]. Simulation of the human face and describing various facial expressions have been subjects of interest in computer graphics and animation.…”
Section: Introduction
confidence: 99%
“…2, and can be displayed in a number of different ways. However, there are many more expressions than these basic six, and research continues on the identification of different emotions and how they are displayed [1,4]. The ability for a system to differentiate between these emotions and identify them correctly has obvious implications on human-computer interaction and the development of empathetic response in machines.…”
Section: Introduction
confidence: 99%
“…Additionally, the emotion-based artificial decision-making model has been shown to enhance the performance of educational agents in virtual settings (Yang and Zhen, 2014). Another approach involves the integration of emotional agents in AI-based learning environments to improve learner motivation, self-assessment, and self-motivation by improving the socioemotional climate (Gorga and Schneider, 2009), especially affective computing (Kort et al., 2001; González-Hernández et al., 2018; Ninaus et al., 2019; Shobana and Kumar, 2021; He et al., 2022; Aly et al., 2023; Villegas-Ch et al., 2023).…”
Section: Introduction
confidence: 99%