2006
DOI: 10.1509/jmkr.43.2.276
Bagging and Boosting Classification Trees to Predict Churn

Abstract: Department of Applied Economics, K.U. Leuven. The authors are grateful to Marnik Dekimpe for his valuable and helpful comments and to the Teradata Center for Customer Relationship Management at Duke University for the data and remarks. They also wish to thank the two anonymous JMR reviewers for their constructive comments. This research was funded by the Research Fund at K.U. Leuven and the Fonds voor Wetenschappelijk Onderzoek (Contract No. G.0385.03).

Cited by 336 publications (229 citation statements). References 48 publications.
“…It will be interesting to compare both approaches in future research. In the spirit of [27], a marketing study that finds bagging and boosting techniques to perform well for customer behavior prediction, and [1] who find Random Forests with SMOTE to perform best for purchase prediction in free-to-play games, we adopt tree-based (ensemble) methods combined with approaches to deal with the strong imbalance in the data [12]. We also include regression techniques as these have been reported to work well [15,17].…”
Section: Predicting LTV in Non-Contractual Freemium Settings
Mentioning confidence: 99%
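
The combination referenced in this statement, tree-based ensemble methods paired with a resampling step for strongly imbalanced targets, can be illustrated schematically. The sketch below assumes scikit-learn and imbalanced-learn and uses a synthetic imbalanced dataset as a stand-in for real customer data; it is not the setup of the cited studies.

```python
# Minimal sketch: Random Forest combined with SMOTE oversampling for an
# imbalanced binary target (e.g., churn or purchase). Data are synthetic;
# library and parameter choices are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline

# Synthetic stand-in: roughly 5% positives, mimicking a rare event.
X, y = make_classification(n_samples=10_000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)

# SMOTE sits inside the pipeline so oversampling is applied only to the
# training folds during cross-validation, never to the validation fold.
model = Pipeline([
    ("smote", SMOTE(random_state=0)),
    ("forest", RandomForestClassifier(n_estimators=300, random_state=0)),
])

auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("Cross-validated AUC:", auc.mean())
```
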
“…Churn is usually a rare event [30]. Many techniques have been proposed to deal with class imbalance in churn prediction [8,9]. In [31], the effect of class imbalance on a number of performance metrics is analyzed for probability estimation trees.…”
Section: Data
Mentioning confidence: 99%
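
As a simple illustration of the imbalance issue raised in this statement, the sketch below contrasts an unweighted classification tree with a class-weighted one as probability estimators on a synthetic rare-event target. The data and settings are placeholders, not those of the cited studies.

```python
# Sketch: effect of class weighting on a classification tree used as a
# probability estimator under class imbalance. Synthetic data; all
# hyperparameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=20_000, n_features=15,
                           weights=[0.97, 0.03], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

for weighting in (None, "balanced"):
    tree = DecisionTreeClassifier(max_depth=6, class_weight=weighting,
                                  random_state=1).fit(X_tr, y_tr)
    scores = tree.predict_proba(X_te)[:, 1]  # class-membership probabilities
    print(weighting, "AUC:", round(roc_auc_score(y_te, scores), 3))
```
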
“…In this study, the use of ensemble learning for churn prediction is considered. Applications of ensemble classifiers to churn prediction include Random Forests [5], AdaBoost [6], AdaCost [7], Bagging [8], Stochastic Gradient Boosting [9], ensembles of Artificial Neural Networks [4] and the multi-classifier class-combiner technique [10]. These studies all demonstrate the beneficial impact of using ensemble classifiers over single classifiers for classification performance in the context of churn prediction.…”
Section: Introduction
Mentioning confidence: 98%
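
The ensemble families listed in this statement can be compared against a single classification tree along the following lines. This is a schematic example on synthetic data under assumed scikit-learn defaults, not a reproduction of any of the cited churn studies.

```python
# Sketch: a single classification tree versus bagged and boosted tree
# ensembles on a synthetic, churn-like imbalanced target. Illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=10_000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

models = {
    "single tree": DecisionTreeClassifier(max_depth=6, random_state=0),
    "bagging": BaggingClassifier(n_estimators=200, random_state=0),    # bagged trees
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=0),  # boosted stumps
    "gradient boosting": GradientBoostingClassifier(random_state=0),   # gradient-boosted trees
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:>18}: AUC = {auc:.3f}")
```
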
“…The individual predictions are combined via weighted voting. The popular combination methods of bagging [2] and boosting [6] are tested on a mixture of customer and contractual data in [8].…”
Section: Related Work
Mentioning confidence: 99%
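
The weighted-voting combination mentioned in this statement can be sketched with a soft-voting wrapper over a bagged and a boosted model; the components and weights below are arbitrary placeholders chosen for illustration, not the configuration of the cited work.

```python
# Sketch: combining a bagged-tree and a boosted-tree model by weighted
# (soft) voting over their predicted class probabilities. Weights arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              VotingClassifier)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=10_000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

combined = VotingClassifier(
    estimators=[
        ("bagging", BaggingClassifier(n_estimators=200, random_state=0)),
        ("boosting", GradientBoostingClassifier(random_state=0)),
    ],
    voting="soft",   # average predicted probabilities ...
    weights=[1, 2],  # ... giving the boosted model twice the weight
)

print("AUC:", cross_val_score(combined, X, y, cv=5, scoring="roc_auc").mean())
```

Soft voting averages probabilities rather than hard class labels, which keeps the combined output usable for ranking customers by predicted churn risk.
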