2000
DOI: 10.1142/s0218213000000070

Stable Decision Trees: Using Local Anarchy for Efficient Incremental Learning

Abstract: This work deals with stability in incremental induction of decision trees. Stability problems arise when an induction algorithm must revise a decision tree very often and oscillations between similar concepts decrease learning speed. We introduce a heuristic and an algorithm with theoretical and experimental backing to tackle this problem.
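The abstract does not spell out the paper's heuristic, so the sketch below only illustrates the general setting it addresses: in incremental induction, a leaf accumulates incoming examples and is restructured only when one attribute clearly outperforms the runner-up, so that near-ties in the stream do not cause the tree to oscillate between similar concepts. The margin rule, the Gini criterion, and the toy attributes are illustrative assumptions, not the algorithm proposed in the paper.

```python
from collections import Counter

class LeafNode:
    """A leaf that stores its examples and class counts until a split decision is made."""
    def __init__(self):
        self.examples = []       # (feature_dict, label) pairs retained at the leaf
        self.counts = Counter()  # class distribution at the leaf

    def add(self, x, y):
        self.examples.append((x, y))
        self.counts[y] += 1

def gini(counts):
    """Gini impurity of a class-count distribution."""
    n = sum(counts.values())
    return 0.0 if n == 0 else 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_gain(examples, attr):
    """Impurity reduction obtained by splitting the stored examples on `attr`."""
    parent = gini(Counter(y for _, y in examples))
    by_value = {}
    for x, y in examples:
        by_value.setdefault(x[attr], Counter())[y] += 1
    n = len(examples)
    children = sum(sum(c.values()) / n * gini(c) for c in by_value.values())
    return parent - children

def should_split(leaf, attrs, margin=0.05):
    """Split only when the best attribute beats the runner-up by `margin`,
    so a stream of near-ties does not keep restructuring the tree."""
    gains = sorted((split_gain(leaf.examples, a) for a in attrs), reverse=True)
    if not gains:
        return False
    if len(gains) == 1:
        return gains[0] > margin
    return gains[0] - gains[1] > margin

# Example: feed a few labelled examples and check the split decision.
leaf = LeafNode()
for x, y in [({"outlook": "sunny", "windy": "no"}, "play"),
             ({"outlook": "rain",  "windy": "yes"}, "stay"),
             ({"outlook": "sunny", "windy": "yes"}, "play"),
             ({"outlook": "rain",  "windy": "no"}, "stay")]:
    leaf.add(x, y)
print(should_split(leaf, ["outlook", "windy"]))  # "outlook" clearly wins, so split
```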

Cited by 10 publications (5 citation statements)
References 6 publications

“…We note that, while this approach has merits, it is in fact orthogonal to our approach; we do not deal with intermediate trees but with data that has to reside in the tree so that fitness can be reconstructed. Nevertheless, to factor in an estimate of when to apply one heuristic and to make that decision on the fly has already been demonstrated to be beneficial in incremental decision tree induction (Kalles & Papagelis, 2000).…”
Section: Discussion (mentioning)
Confidence: 99%
“…To achieve hiding by minimally modifying the original dataset, we may interpret "minimally" as referring to changes in the dataset or to whether the sanitized decision tree produced via hiding is syntactically close to the original. Measuring minimality in how one modifies decision trees has been studied with respect to heuristics that guarantee or approximate the impact of changes [19][20][21].…”
Section: Methods (mentioning)
Confidence: 99%
“…To achieve hiding by minimally modifying the original data set, we can interpret “minimal” as referring to changes in the data set or to whether the sanitized decision tree generated through hiding is syntactically close to the original. Measuring how minimally a decision tree is changed has been examined in the context of heuristics that guarantee or approximate the effect of changes [22, 23, 24]. In our examples, we use the kappa statistic to compare the performance of the decision tree induced after the proposed modification with that of the original one.…”
Section: Methods (mentioning)
Confidence: 99%
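The last excerpt mentions using the kappa statistic to compare the tree induced after modification with the original one. A minimal sketch of that comparison is shown below; the Iris data and the depth-capped "modified" tree are placeholder assumptions standing in for whatever hiding transformation the cited work actually applies, not its real sanitization procedure.

```python
from sklearn.datasets import load_iris
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Original tree versus a "modified" tree; the depth cap is only a stand-in
# for a real sanitization / hiding transformation of the training data or tree.
original = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
modified = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)

# Cohen's kappa measures agreement between the two trees' predictions beyond chance;
# values near 1 mean the modified tree behaves almost like the original on unseen data.
kappa = cohen_kappa_score(original.predict(X_test), modified.predict(X_test))
print(f"kappa agreement between original and modified tree: {kappa:.3f}")
```

A kappa close to 1 indicates the two trees agree on nearly all test predictions beyond chance, which is the sense in which the excerpt compares the modified tree with the original.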