2006
DOI: 10.1016/j.csda.2004.11.006
Mining the customer credit using classification and regression tree and multivariate adaptive regression splines

Cited by 329 publications (115 citation statements)
References 56 publications
“…Because this system is derived from previous applicants' information, it performs consistently for different applicants in the same situation, unlike personal decision-making methods, which vary with individual opinions. In addition, since banks try to prevent the risk of non-repayable credits, intelligent systems can recognize credits with a high probability of default [1,3].…”
Section: Introduction
confidence: 99%
“…In contemporary banking systems, traditional decision-making methods have been set aside and intelligent systems are often used [1]. Increased accuracy, the omission of personal judgments, and shorter decision-making time are the advantages of such systems.…”
Section: Introduction
confidence: 99%
“…First, a model containing too many basis functions, which leads to overfitting, is created. At this stage it is also possible to take interactions between predictors into account, or the predictors can enter only as additive components [7]. In the second stage of the algorithm (pruning), the basis functions that contribute least to the goodness-of-fit are removed [8].…”
Section: Multivariate Adaptive Regression Splines
confidence: 99%
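The two-stage MARS procedure quoted above (an overgrown forward pass, then backward pruning of the least useful basis functions) can be sketched for a single predictor using mirrored hinge functions and a generalized cross-validation (GCV) score. This is a minimal illustration, not the paper's implementation: the name `mars_1d`, the quantile knot grid, and `max_terms` are all assumptions made for the example.

```python
import numpy as np

def hinge(x, knot, sign):
    """Hinge basis function: max(0, sign * (x - knot))."""
    return np.maximum(0.0, sign * (x - knot))

def fit_sse(B, y):
    """Least-squares fit of y on basis matrix B; return coefs and SSE."""
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    resid = y - B @ coef
    return coef, float(resid @ resid)

def mars_1d(x, y, max_terms=8):
    """Toy two-stage MARS on one predictor: greedy forward growth of
    mirrored hinge pairs, then backward pruning by GCV."""
    n = len(y)
    basis = [np.ones(n)]                  # start with the intercept
    descr = [("intercept",)]
    # --- forward pass: add the hinge pair that most reduces SSE ---
    while len(basis) < max_terms:
        best = None
        for knot in np.quantile(x, np.linspace(0.1, 0.9, 9)):
            cand = basis + [hinge(x, knot, +1), hinge(x, knot, -1)]
            _, sse = fit_sse(np.column_stack(cand), y)
            if best is None or sse < best[0]:
                best = (sse, knot)
        _, knot = best
        basis += [hinge(x, knot, +1), hinge(x, knot, -1)]
        descr += [("hinge", knot, +1), ("hinge", knot, -1)]
    # --- backward pass: drop terms while the GCV score improves ---
    def gcv(sse, m):                      # penalizes model size m
        return sse / n / (1 - m / n) ** 2
    keep = list(range(len(basis)))
    _, sse = fit_sse(np.column_stack([basis[i] for i in keep]), y)
    best_gcv = gcv(sse, len(keep))
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for i in keep[1:]:                # never drop the intercept
            trial = [j for j in keep if j != i]
            _, sse = fit_sse(np.column_stack([basis[j] for j in trial]), y)
            if gcv(sse, len(trial)) < best_gcv:
                best_gcv, keep, improved = gcv(sse, len(trial)), trial, True
                break
    B = np.column_stack([basis[i] for i in keep])
    coef, _ = fit_sse(B, y)
    return coef, [descr[i] for i in keep], B

# hinge-shaped toy data: flat below 0.5, slope 3 above, plus noise
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = np.maximum(0, x - 0.5) * 3 + rng.normal(0, 0.1, 200)
coef, terms, B = mars_1d(x, y)
print(len(terms), "terms kept")
```

The forward pass deliberately overfits (as the quoted passage describes), and the GCV-driven backward pass then removes basis functions whose loss of fit does not justify the added model complexity.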
“…19−21 Compared to other statistical tools such as linear regression and discriminant analysis, decision trees offer several major advantages, including flexibility in handling different types of response variables, the capability to identify complicated relationships without strong model assumptions, robustness with respect to outliers, the ability to deal with missing values, and easy interpretation of results. 17,22 Classic decision trees such as the classification and regression tree (CART) 23 and "C5" 24 are constructed by selecting the best split, measured by the Gini index or information gain, through exhaustive search over all possible variables to split on and all possible split points. These algorithms suffer from two major limitations: overfitting 25 and biased variable selection, that is, a preference for variables with more categories and for continuous variables.…”
Section: Introduction
confidence: 99%
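The exhaustive split search described in the quoted passage can be sketched for a single continuous feature: score every candidate threshold by the weighted Gini impurity of the two resulting child nodes and keep the best one. This is a toy single-feature version for illustration, not the full CART algorithm; the function names are made up for the example.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array: 1 - sum over classes of p_k^2."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(x, y):
    """Exhaustive search (CART-style) for the threshold on one continuous
    feature that minimizes the weighted Gini impurity of the children."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    n = len(y)
    best_t, best_score = None, gini(y)    # no split beats the parent? keep None
    for i in range(1, n):
        if x[i] == x[i - 1]:
            continue                      # cannot split between equal values
        t = (x[i] + x[i - 1]) / 2         # midpoint threshold
        left, right = y[:i], y[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# perfectly separable toy data: class 0 below 5, class 1 above
x = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 7.0, 8.0, 9.0])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
t, score = best_split(x, y)
print(t, score)   # → 5.0 0.0
```

The two limitations the passage names are visible even in this sketch: growing a tree by repeating this search until nodes are pure overfits the training data, and a feature with many distinct values offers far more candidate thresholds, biasing selection toward it.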