2022 4th International Conference on Smart Systems and Inventive Technology (ICSSIT)
DOI: 10.1109/icssit53264.2022.9716450
Machine Learning based Prediction of Dropout Students from the Education University using SMOTE

Cited by 5 publications (4 citation statements)
References 26 publications
“…Effective techniques to improve the performance of classifiers in the presence of a class imbalance in training data [42] include data-level approaches such as undersampling, oversampling, and their combinations [43,44]. In the context of this study, the majority class consists of successful students, and this asymmetry can lead a classifier to primarily predict that students do not drop out of the MOOC, which is an undesirable situation as students at risk are incorrectly identified as non-dropouts, resulting in them not receiving the necessary support intervention.…”
Section: Handling Imbalanced Classes in Dropout Prediction
confidence: 99%
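As an illustration of the data-level approaches named in this citation statement, the sketch below applies oversampling (SMOTE), undersampling, and a combination of the two (SMOTE followed by Tomek-link cleaning) using the imbalanced-learn library. The synthetic dataset, class ratio, and parameter values are assumptions chosen only for illustration; they are not taken from the cited studies.

```python
# Sketch: data-level approaches to class imbalance (oversampling,
# undersampling, and their combination), assuming imbalanced-learn.
from collections import Counter

from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler
from imblearn.combine import SMOTETomek

# Illustrative data: 1 = dropout (minority), 0 = successful student (majority)
X, y = make_classification(
    n_samples=2000, n_features=10, weights=[0.9, 0.1], random_state=42
)
print("original:", Counter(y))

# Oversampling: synthesise new minority (dropout) examples
X_os, y_os = SMOTE(random_state=42).fit_resample(X, y)
print("SMOTE:", Counter(y_os))

# Undersampling: discard majority (successful-student) examples
X_us, y_us = RandomUnderSampler(random_state=42).fit_resample(X, y)
print("undersampling:", Counter(y_us))

# Combination: SMOTE oversampling followed by Tomek-link cleaning
X_c, y_c = SMOTETomek(random_state=42).fit_resample(X, y)
print("SMOTE+Tomek:", Counter(y_c))
```

Printing the class counts before and after resampling makes the effect of each strategy visible at a glance.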
“…As a variation of the SMOTE technique, He et al. [48] proposed ADASYN (adaptive synthetic sampling method), which generates more synthetic data for the instances that are more difficult to learn [49]. Improved predictive performance of classification models through the combined use of PCA and resampling techniques has been reported (e.g., [42]).…”
Section: Handling Imbalanced Classes in Dropout Prediction
confidence: 99%
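The sketch below illustrates the two ideas in this statement under stated assumptions: ADASYN oversampling via imbalanced-learn, and PCA combined with SMOTE inside a sampler-aware pipeline. The dataset, number of components, and choice of classifier are illustrative assumptions, not details drawn from the cited works.

```python
# Sketch: ADASYN oversampling, and PCA combined with resampling in one
# pipeline, assuming scikit-learn and imbalanced-learn.
from collections import Counter

from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from imblearn.over_sampling import ADASYN, SMOTE
from imblearn.pipeline import Pipeline  # sampler-aware pipeline

X, y = make_classification(
    n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=0
)

# ADASYN places more synthetic samples in regions that are harder to learn
X_ada, y_ada = ADASYN(random_state=0).fit_resample(X, y)
print("ADASYN:", Counter(y_ada))

# PCA + SMOTE + classifier; the sampler step is applied only during fit,
# so held-out folds in cross-validation remain untouched.
model = Pipeline(steps=[
    ("pca", PCA(n_components=10)),
    ("smote", SMOTE(random_state=0)),
    ("clf", RandomForestClassifier(random_state=0)),
])
model.fit(X, y)
```

Using the imbalanced-learn pipeline rather than resampling the whole dataset up front keeps the oversampling confined to the training data.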
“…The two main approaches to dealing with an imbalanced target variable are oversampling techniques, which create additional minority-class instances (student records with a course failure), and undersampling techniques, which randomly delete majority-class instances (student records without a course failure). We chose to balance our dataset with an oversampling method, the synthetic minority oversampling technique (SMOTE), which is commonly used in models predicting student success in higher education [45, 46, 47]. After applying SMOTE, the number of students in our dataset who failed a course equals the number of students who did not.…”
Section: Simulation of Dataset and Creation of a Random Forest Machin...
confidence: 99%
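A minimal sketch of the workflow this excerpt describes, assuming scikit-learn and imbalanced-learn: split the data, apply SMOTE only to the training portion so the two classes become equal in size, then fit a random forest. The synthetic data and parameter values are placeholders, not the authors' actual dataset or settings.

```python
# Sketch: SMOTE-balanced training data followed by a random forest,
# assuming scikit-learn and imbalanced-learn.
from collections import Counter

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE

# Illustrative data: 1 = failed a course (minority), 0 = did not fail (majority)
X, y = make_classification(
    n_samples=3000, n_features=15, weights=[0.85, 0.15], random_state=1
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, test_size=0.25, random_state=1
)

# After SMOTE, both classes in the training set have the same number of records
X_bal, y_bal = SMOTE(random_state=1).fit_resample(X_train, y_train)
print("balanced training set:", Counter(y_bal))

clf = RandomForestClassifier(random_state=1).fit(X_bal, y_bal)
print(classification_report(y_test, clf.predict(X_test)))
```

Evaluating on the untouched test split keeps the reported performance free of synthetic records.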