Proceedings of the 2010 ACM Symposium on Applied Computing
DOI: 10.1145/1774088.1774327

Evolutionary model tree induction

Abstract: Model trees are a particular case of decision trees employed to solve regression problems. They have the advantage of presenting an interpretable output with an acceptable level of predictive performance. Since generating an optimal model tree is an NP-complete problem, traditional model tree induction algorithms use a greedy heuristic, which may not converge to the globally optimal solution. We propose the use of the evolutionary algorithm (EA) paradigm as an alternative heuristic to generate model trees…

Cited by 13 publications (16 citation statements)
References 21 publications
“…is the standard deviation, D is the portion of the instances that reaches the node being tested, and Di is the portion of the instances that results from partitioning the node. This approach has two main advantages [4,6]: i) the threshold values are defined in a data-driven manner instead of being selected at random, as in most evolutionary approaches; and ii) we can achieve a certain degree of heterogeneity by partitioning the training set (in this case, the imputed sub-training set) into different pieces, increasing the chances of selecting a good threshold value.…”
Section: Generating the Initial Population
confidence: 99%
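The data-driven threshold selection described above can be sketched as follows. This is a minimal illustration, assuming a binary split on a single numeric attribute and the standard M5-style SDR criterion; the function names are hypothetical, not taken from the cited work:

```python
import statistics

def sdr(parent, partitions):
    """Standard deviation reduction of a candidate split.

    parent: target values of the instances reaching the node (D)
    partitions: lists of target values after splitting (the Di)
    """
    n = len(parent)
    return statistics.pstdev(parent) - sum(
        len(p) / n * statistics.pstdev(p) for p in partitions
    )

def best_threshold(xs, ys):
    """Pick the threshold on attribute values xs that maximizes SDR
    over the induced binary partition -- data-driven, not random."""
    candidates = sorted(set(xs))[:-1]  # split points between observed values
    def split(t):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        return [left, right]
    return max(candidates, key=lambda t: sdr(ys, split(t)))
```

Because candidate thresholds come only from values actually observed in the (sub-)training set, partitioning that set into different pieces yields different candidate pools, which is the source of the heterogeneity the statement mentions.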
“…Barros et al [46], [47] propose an EA called E-Motion (Evolutionary MOdel Tree InductiON) for axis-parallel model tree induction, where each individual is represented as a tree of variable shape and size. The initialization of individuals is domain knowledge-based, as it combines single nodes whose attribute tests are dictated by the expectation of standard deviation reduction (SDR), given by…”
Section: B Model Trees
confidence: 99%
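A variable-shape, variable-size individual of the kind described above can be sketched as a simple recursive structure. This is an illustrative representation only; the names and the constant-prediction leaves are assumptions, not E-Motion's actual data structures (its leaves hold linear models):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TreeNode:
    """One node of an axis-parallel model-tree individual.
    Internal nodes carry a test `attribute <= threshold`;
    leaves carry a prediction (simplified here to a constant)."""
    attribute: Optional[int] = None
    threshold: Optional[float] = None
    left: Optional["TreeNode"] = None
    right: Optional["TreeNode"] = None
    prediction: Optional[float] = None

    def is_leaf(self):
        return self.left is None and self.right is None

def size(node):
    """Total node count; under an EA, individuals differ in size
    and shape, unlike fixed-length genome encodings."""
    if node.is_leaf():
        return 1
    return 1 + size(node.left) + size(node.right)
```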
“…Nevertheless, many works claim predictive performance similar to that of baseline algorithms while producing smaller trees (e.g., [17], [46], [47], [100], [102], [103], [105], [111]). Only one work [120] states that the proposed EA generates larger trees than its baseline algorithm (C4.5).…”
Section: Performance Analyses
confidence: 99%
“…The decision trees produced by this strategy are usually fully balanced, where the distance from the root to any leaf node is the same. This generation procedure is called the full method (see, for instance, [71,66,72,68,44,6,7]). Trees can also be generated with varying distances from the root to the leaves, named the grow method (e.g., [48]).…”
Section: Related Work
confidence: 99%
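The two generation procedures contrasted above can be sketched as follows. This is a minimal illustration with placeholder trees; the early-termination probability in `grow` is an assumed parameter, not one prescribed by the cited works:

```python
import random

def full(depth, rng):
    """Full method: every branch extends to the maximum depth,
    producing a perfectly balanced tree."""
    if depth == 0:
        return ("leaf",)
    return ("split", full(depth - 1, rng), full(depth - 1, rng))

def grow(depth, rng, p_leaf=0.3):
    """Grow method: a node may become a leaf before the depth limit
    (with assumed probability p_leaf), so root-to-leaf distances vary."""
    if depth == 0 or rng.random() < p_leaf:
        return ("leaf",)
    return ("split", grow(depth - 1, rng, p_leaf), grow(depth - 1, rng, p_leaf))

def depths(tree, d=0):
    """Root-to-leaf distance of every leaf, left to right."""
    if tree[0] == "leaf":
        return [d]
    return depths(tree[1], d + 1) + depths(tree[2], d + 1)
```

With `full`, every leaf sits at the depth limit; with `grow`, leaf depths vary, which is why EA initializations often mix both (the "ramped half-and-half" approach) to diversify the initial population.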