2023 1st International Conference on Advanced Innovations in Smart Cities (ICAISC)
DOI: 10.1109/icaisc56366.2023.10085183

Telecom Churn Analysis using Machine Learning in Smart Cities

Cited by 1 publication (2 citation statements)
References 21 publications
“…The reported experimental results demonstrated that the ensemble learning techniques (i.e., the AdaBoost classifier and XGBoost classifier) provide optimal accuracy with an area-under-curve (AUC) score of 84% in the churn prediction task. The work in [5] manages telecom customer data and proposes a churn prediction model based on machine learning algorithms. Prior data prediction, preparation, and cleaning are performed to prepare quality data for machine learning.…”
Section: Related Work
confidence: 99%
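
The AUC-based evaluation described in this citation statement can be reproduced in outline with scikit-learn and the xgboost package. The following is a minimal sketch, assuming a synthetic stand-in for the cleaned and encoded telecom customer features; the cited work's actual dataset, preprocessing, and hyperparameters are not reproduced here.

```python
# Hypothetical sketch: ensemble churn classifiers evaluated by AUC.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

# Synthetic stand-in for prepared (cleaned, encoded) customer features;
# churners are the minority class, as is typical in churn data.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.8, 0.2],
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=200, eval_metric="logloss",
                             random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    churn_scores = model.predict_proba(X_test)[:, 1]  # predicted churn probability
    print(f"{name}: AUC = {roc_auc_score(y_test, churn_scores):.3f}")
```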
“…There are a plethora of machine learning classifiers that have been proposed to analyze customer data for predicting customer churn. These include single classifiers, such as support vector machines, naïve Bayes, decision trees, logistic regression, and k-nearest neighbors, and ensemble classifiers, such as AdaBoost, gradient boosting, XGBoost, CatBoost, and random forests [1, 5-12]. It has been asserted that ensemble classifiers perform better than single classifiers [13].…”
Section: Introduction
confidence: 99%
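
As a rough illustration of the single-versus-ensemble comparison mentioned in this passage, the sketch below cross-validates a few of the listed classifier families on the same kind of synthetic churn-style data. The classifiers, hyperparameters, and data are placeholders, not those used in the cited studies.

```python
# Hypothetical sketch: comparing single and ensemble classifiers by cross-validated AUC.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.8, 0.2],
                           random_state=0)

classifiers = {
    "LogisticRegression (single)": LogisticRegression(max_iter=1000),
    "DecisionTree (single)": DecisionTreeClassifier(random_state=0),
    "RandomForest (ensemble)": RandomForestClassifier(n_estimators=200, random_state=0),
    "AdaBoost (ensemble)": AdaBoostClassifier(n_estimators=200, random_state=0),
}

for name, clf in classifiers.items():
    # 5-fold cross-validated AUC as a single comparison metric.
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```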