2017
DOI: 10.18178/ijmlc.2017.7.6.641

Decision Tree Pruning via Multi-Objective Evolutionary Computation

Abstract: To date, decision trees are among the most used classification models. They owe their popularity to their efficiency during both the learning and the classification phases and, above all, to the high interpretability of the learned classifiers. This latter aspect is of primary importance in those domains in which understanding and validating the decision process is as important as the accuracy degree of the prediction. Pruning is a common technique used to reduce the size of decision trees, thus improving their…

Cited by 7 publications (1 citation statement)
References 19 publications
“…This method is used to strike a balance between the accuracy and the complexity of the tree. However, the computational overhead of processing the tree after it has been constructed is regarded as the main limitation [10]. Because post-pruning requires the tree to be fully built first, which demands considerable time and memory when the amount of data is large, the pre-pruning method was implemented in the present study.…”
Section: Decision Tree Pruning
confidence: 99%
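
To make the pre- versus post-pruning distinction in the statement above concrete, here is a minimal sketch using scikit-learn's DecisionTreeClassifier. It is not the multi-objective evolutionary method of the cited paper; the dataset and the parameter values (max_depth, min_samples_leaf, the choice of ccp_alpha) are illustrative assumptions only.

```python
# Sketch contrasting pre-pruning (stop growth early) with post-pruning
# (grow the full tree, then cut it back) using scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Pre-pruning: growth constraints mean the full tree is never built,
# which avoids the post-construction processing overhead noted above.
pre = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
pre.fit(X_tr, y_tr)

# Post-pruning: the full tree must be constructed first, then pruned
# here via cost-complexity pruning (ccp_alpha).
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
alphas = full.cost_complexity_pruning_path(X_tr, y_tr).ccp_alphas
post = DecisionTreeClassifier(ccp_alpha=alphas[len(alphas) // 2],
                              random_state=0).fit(X_tr, y_tr)

# Compare tree size (node count) against held-out accuracy.
for name, model in [("full", full), ("pre-pruned", pre), ("post-pruned", post)]:
    print(f"{name:12s} nodes={model.tree_.node_count:4d} "
          f"test acc={model.score(X_te, y_te):.3f}")
```

Both routes trade tree size against accuracy; the citing study's argument is simply that the pre-pruning route sidesteps building the full tree at all, which matters when the training data are large.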