Extensions to the CART algorithm
Crawford (1989). DOI: 10.1016/0020-7373(89)90027-8

Cited by 121 publications (47 citation statements). References 4 publications.
“…Just as many training parameters in an artificial neural network must be determined empirically for a particular application [54], these parameters for a probabilistic network can be determined empirically using some training and testing data. A decision tree can be used as a classifier or to reveal the underlying patterns of data [59][60][61]. As a classifier, a decision tree is used to divide a problem region into several sub-regions with all the examples (data points) in one sub-region having the same target value.…”
Section: 4 Results and Discussion (mentioning; confidence: 99%)
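
The sub-region behaviour described in this excerpt is easy to see in practice. Below is a minimal sketch using scikit-learn's DecisionTreeClassifier; the dataset and parameter choices are illustrative, not drawn from the cited works. With no depth limit, CART keeps splitting until each leaf, i.e. each sub-region, holds examples of a single class.

    # Minimal sketch: an unpruned decision tree partitions the feature
    # space into sub-regions (leaves) whose training examples all share
    # one target value. Dataset and parameters are illustrative only.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    tree = DecisionTreeClassifier(criterion="gini", random_state=0)
    tree.fit(X, y)
    print(tree.get_n_leaves(), "pure sub-regions (leaves)")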
“…CHAID uses a chi-square test to find the most significant split points on the predictor variables. GINI is essentially CART [59][60][61] with the GINI index. The GINI impurity index is defined as the probability at a test node that a randomly chosen case is classified incorrectly.…”
Section: 4 Results and Discussion (mentioning; confidence: 99%)
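
The definition quoted above corresponds to the formula G = 1 - sum_k p_k^2, where p_k is the fraction of cases of class k at the node: the probability of misclassifying a randomly drawn case when it is labelled at random from the node's own class distribution. A minimal sketch; the function name is ours, not from the cited papers.

    from collections import Counter

    def gini_impurity(labels):
        # G = 1 - sum(p_k ** 2): probability that a case drawn at random
        # from this node is misclassified when labelled at random from
        # the node's class distribution.
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    # A pure node has impurity 0; an evenly mixed binary node has 0.5.
    assert gini_impurity(["a", "a", "a"]) == 0.0
    assert gini_impurity(["a", "b"]) == 0.5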
“…An early incremental decision tree algorithm was proposed by Crawford (1989) based on CART (Breiman et al 1984). When a new training instance would cause a new test to be picked at a decision node, the entire subtree rooted at this node is discarded and rebuilt based on the corresponding subset of the training examples.…”
Section: Incremental Decision Trees (mentioning; confidence: 99%)
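
Both this excerpt and the next describe the same update rule: when a new example changes which test is best at a node, the subtree rooted there is discarded and rebuilt from the stored examples. A self-contained schematic for 1-D numeric features with threshold tests; all names and the data are illustrative, and this is not Crawford's implementation.

    def gini(ys):
        # Gini impurity of a list of class labels.
        n = len(ys)
        return 1.0 - sum((ys.count(c) / n) ** 2 for c in set(ys))

    def best_threshold(xs, ys):
        # Threshold test "x <= t" with the lowest weighted Gini impurity.
        best_t, best_score = None, float("inf")
        for t in sorted(set(xs))[:-1]:
            left = [y for x, y in zip(xs, ys) if x <= t]
            right = [y for x, y in zip(xs, ys) if x > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
            if score < best_score:
                best_t, best_score = t, score
        return best_t

    class Node:
        def __init__(self, xs, ys):
            self.xs, self.ys = list(xs), list(ys)  # examples stored at this node
            self.t = self.left = self.right = None
            if len(set(self.ys)) > 1 and len(set(self.xs)) > 1:
                self.t = best_threshold(self.xs, self.ys)
                lo = [(x, y) for x, y in zip(self.xs, self.ys) if x <= self.t]
                hi = [(x, y) for x, y in zip(self.xs, self.ys) if x > self.t]
                self.left, self.right = Node(*zip(*lo)), Node(*zip(*hi))

    def update(node, x, y):
        # Incremental step: if the new example changes the best test at
        # this node, discard the subtree and rebuild it from the stored
        # examples; otherwise route the example to the matching child.
        node.xs.append(x)
        node.ys.append(y)
        if node.t is None:  # leaf: regrow it if it is no longer pure
            return Node(node.xs, node.ys) if len(set(node.ys)) > 1 else node
        if best_threshold(node.xs, node.ys) != node.t:
            return Node(node.xs, node.ys)  # new best test: rebuild subtree
        if x <= node.t:
            node.left = update(node.left, x, y)
        else:
            node.right = update(node.right, x, y)
        return node

    root = Node([1, 2, 8], ["a", "a", "b"])  # initial batch tree, split at x <= 2
    root = update(root, 9, "b")              # best test unchanged: routed to a child
    root = update(root, 3, "a")              # best test moves to x <= 3: rebuilt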
“…Schlimmer and Fisher's (1986) ID4 demonstrated incremental tree induction through test revision and discarding of subtrees. Crawford (1989) has constructed an incremental version of the CART algorithm (Breiman, Friedman, Olshen & Stone, 1984). When a new example is received, if a new test would be picked at a decision node, a new subtree with the new test is built from scratch from the corresponding subset of the training examples.…”
Section: Related Work (mentioning; confidence: 99%)