2019
DOI: 10.1007/978-3-030-36365-9_25
Employee Turnover Prediction Using Machine Learning

Cited by 16 publications (7 citation statements)
References 18 publications
“…The Extreme Gradient Boosting (XGB) (27) model is an accurate and scalable variant of boosted-tree models, designed to optimize computational efficiency and predictive accuracy. XGB also uses regularization to control overfitting (28).…”
Section: Methods
confidence: 99%
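The statement above attributes XGBoost's resistance to overfitting to its regularization term. As a minimal illustration in plain Python (not the xgboost library itself), XGBoost's regularized objective gives each leaf the closed-form optimal weight w* = -G / (H + λ), where G and H are the sums of gradients and Hessians over the leaf and λ is the L2 penalty, so a larger λ shrinks leaf weights toward zero. The numbers below are hypothetical:

```python
# Illustrative sketch: how XGBoost's L2 regularization term shrinks
# leaf weights. For a leaf, the optimal weight is
#   w* = -G / (H + lambda)
# where G is the sum of gradients, H the sum of Hessians over the
# examples in the leaf, and lambda the L2 penalty (reg_lambda in xgboost).

def leaf_weight(gradients, hessians, reg_lambda=1.0):
    """Optimal leaf weight under L2 regularization."""
    G = sum(gradients)
    H = sum(hessians)
    return -G / (H + reg_lambda)

# Hypothetical per-example gradients/Hessians for squared-error loss:
# gradient of 0.5*(pred - y)^2 is (pred - y); its Hessian is 1.
grads = [-0.8, -1.2, -1.0]
hess = [1.0, 1.0, 1.0]

w_unreg = leaf_weight(grads, hess, reg_lambda=0.0)  # 1.0
w_reg = leaf_weight(grads, hess, reg_lambda=3.0)    # 0.5

print(w_unreg, w_reg)  # larger lambda -> smaller (shrunken) leaf weight
```

Smaller leaf weights mean each tree makes more conservative corrections, which is the mechanism behind the overfitting control cited here.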
“…Sampe et al. (2019) used LR and ANN on human resources data from Kaggle for employee turnover prediction. Alaskar et al. (2019) presented a comparative analysis pairing five machine learning algorithms (logistic regression, NB, DT, SVM, and AdaBoost) with three feature selection methods (SelectKBest, recursive feature elimination (RFE), and RF) for the employee turnover prediction problem. Their experimental results showed that the two best models were DT with SelectKBest and the SVM with a polynomial kernel using RF.…”
Section: Literature Review
confidence: 99%
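The review above describes pairing classifiers with feature-selection methods such as SelectKBest and RFE. A minimal scikit-learn sketch of that pattern is below; it uses synthetic data and logistic regression as the base model, which are illustrative assumptions and not the cited study's actual dataset or pipeline:

```python
# Minimal sketch (not the cited study's setup): comparing a univariate
# filter (SelectKBest) and a wrapper method (RFE) in front of a classifier.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical synthetic data standing in for an HR turnover dataset.
X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

# SelectKBest: keep the k features with the highest ANOVA F-score.
kbest = make_pipeline(SelectKBest(f_classif, k=5),
                      LogisticRegression(max_iter=1000))

# RFE: recursively drop the weakest features by model coefficients.
rfe = make_pipeline(RFE(LogisticRegression(max_iter=1000),
                        n_features_to_select=5),
                    LogisticRegression(max_iter=1000))

for name, model in [("SelectKBest", kbest), ("RFE", rfe)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```

Wrapping the selector and classifier in one pipeline keeps feature selection inside each cross-validation fold, avoiding leakage when comparing methods as the cited study does.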
“…As a scalable and accurate implementation of gradient boosted trees, it achieves optimized computational speed and model performance. Compared to standard gradient boosting, XGB reduces overfitting by using a regularization term, which contributes to better predictions and much faster run times [63]. • The CatBoost Classifier [64] is an ML algorithm that uses gradient boosting on DTs.…”
Section: Model Training
confidence: 99%
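The statement above describes CatBoost as gradient boosting on decision trees. The toy sketch below shows the general idea behind such tree boosters, fitting each new decision stump to the residuals of the current ensemble, in plain Python on hypothetical 1-D data; it is not CatBoost's actual ordered-boosting algorithm:

```python
# Toy sketch of gradient boosting on decision stumps (squared-error loss):
# each round fits a one-split tree to the current residuals, then adds a
# damped version of it to the ensemble.

def fit_stump(x, residuals):
    """Best single-split regressor on 1-D data (exhaustive threshold search)."""
    best = None
    for threshold in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    _, t, lv, rv = best
    return lambda xi: lv if xi <= t else rv

def boost(x, y, n_rounds=20, lr=0.5):
    """Sequentially fit stumps to residuals; return the ensemble predictor."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

# Hypothetical data: a step function the ensemble should recover.
x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.0, 1.0, 3.0, 3.0, 3.0]
model = boost(x, y)
print([round(model(xi), 2) for xi in x])  # approaches [1, 1, 1, 3, 3, 3]
```

Production boosters like CatBoost and XGBoost follow this fit-to-residuals loop but use full trees, gradient/Hessian statistics, and (in CatBoost's case) ordered boosting and native categorical-feature handling.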