2019
DOI: 10.1108/ils-03-2019-0017

Using educational data mining techniques to increase the prediction accuracy of student academic performance

Abstract: Purpose This paper aims to evaluate educational data mining methods to increase the predictive accuracy of student academic performance for a university course setting. Student engagement data collected in real time and over self-paced activities assisted this investigation. Design/methodology/approach Classification data mining techniques have been adapted to predict students’ academic performance. Four algorithms, Naïve Bayes, Logistic Regression, k-Nearest Neighbour and Random Forest, were used to generat…
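The abstract names four classifiers (Naïve Bayes, Logistic Regression, k-Nearest Neighbour, Random Forest) trained on student engagement data. A minimal sketch of that comparison in Python with scikit-learn is shown below; the synthetic data and feature count are illustrative assumptions, not the paper's actual dataset or features.

```python
# Hedged sketch: comparing the four classifiers named in the abstract
# on synthetic engagement-style data. The dataset is a stand-in; the
# paper's real features (e.g. weekly engagement, assessment scores)
# are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for per-student engagement features.
X, y = make_classification(n_samples=240, n_features=12,
                           n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "k-Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
    "Random Forest": RandomForestClassifier(n_estimators=100,
                                            random_state=0),
}

# Fit each model and record test-set accuracy, mirroring the kind of
# head-to-head accuracy comparison the paper reports.
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.3f}")
```

The relative ordering of the four scores will vary with the data; the point of such a comparison is selecting the model with the best predictive accuracy for the course setting at hand.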

Cited by 40 publications (36 citation statements). References 26 publications.
“…Techniques and highest reported accuracies:

    Ref.  Technique                        Highest accuracy
    [43]  J48                              100%
    [46]  Naïve Bayes (NB)                 98.86%
    [40]  X-Means                          86.17%
    [35]  Support Vector Machine (SVM)     97.98%
    [14]  Ctree                            90.37%
    [49]  Decision Tree (DT)               67%
    [38]  Random Forest (RF)               96.4%
    [34]  Logistic Regression (LR)         96.98%
    [45]  Neural Network (NN)              96%
    [33]  K-means                          98.9%
    [36]  Rule-Based                       71.3%
    [6]   CART                             98.3%
    [4]   RepTree                          61.4%
    [6]   Iterative Dichotomiser 3 (ID3)   95.9%
    [39]  IBK                              82.1%
    [39]  Simple Logistic                  93.27%
    [44]  JRip                             83.46%
    [35]  K-Medoids                        84.04%…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…In 2019, Ramaswami et al [38] used four data mining techniques that included Logistic Regression, Random Forest, k-Nearest Neighbour and Naïve Bayes. The authors used different techniques in order to improve the prediction accuracy of students' performance by using Python.…”
Section: A Prediction of Students' Performance (citation type: mentioning; confidence: 99%)
“…Additionally, a study used two data sets i.e., the academic year (2016-2017) over a 12-week semester for 240 students, and the other one, consisting of students' assessment scores for predicting the performance [20]. Several algorithms were used such as NB, RF, K-nearest neighbor (KNN), and LR, respectively.…”
Section: Related Studies (citation type: mentioning; confidence: 99%)
“…Discovery with Models, according to [35], usually implies the substantiated adaptation of a forecasting model across several environments. The principal application of this EDM class is finding relationships between student conduct and subjective variables in the teaching environment [32].…”
Section: Education Data Mining (EDM) (citation type: mentioning; confidence: 99%)
“…BD adoption and widespread use in other regions, particularly in poor countries, has been bolstered as a result of this development. In accordance with [32], the diffusion of BD gives an account of how BD goes from discovery to widespread use, and how this is aided by steps taken by service providers of important technologies necessary to enhance the resources and capacities of academic institutions. [34] Stated that countries may take advantage of the numerous BD opportunities that are accessible to them in order to gain value from the huge volumes of data that are generated and, in the long term, aid in their development.…”
Section: Proposed Framework, Future Research and Limitations (citation type: mentioning; confidence: 99%)