2017
DOI: 10.1155/2017/3162571

Forest Pruning Based on Branch Importance

Abstract: A forest is an ensemble whose members are decision trees. This paper proposes a novel strategy for pruning a forest to enhance its generalization ability and reduce its size. Unlike conventional ensemble pruning approaches, the proposed method evaluates the importance of tree branches with respect to the whole ensemble using a newly proposed metric called importance gain. The importance of a branch is defined by considering ensemble accuracy and the diversity of ensemble members, and thus the…
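The truncated abstract does not show the exact importance-gain formula, so the following is only a minimal sketch of the general idea: score each branch by how much the whole ensemble's validation accuracy drops when that branch is collapsed into a leaf, and greedily collapse branches whose score is negligible. The `Node` structure, the `branch_importance` score, and the `tol` threshold are illustrative assumptions, not the authors' method, and the diversity term mentioned in the abstract is omitted.

```python
# Illustrative sketch only: a stand-in importance score, NOT the paper's
# importance-gain metric (the abstract above is truncated).
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class Node:
    feature: Optional[int] = None      # None marks a leaf
    threshold: float = 0.0
    majority_class: int = 0            # class predicted if the node is a leaf
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def predict_tree(node: Node, x: np.ndarray) -> int:
    """Route one sample down a (possibly pruned) tree."""
    while node.feature is not None:
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.majority_class


def forest_predict(trees: List[Node], X: np.ndarray) -> np.ndarray:
    """Majority vote over all member trees."""
    votes = np.array([[predict_tree(t, x) for t in trees] for x in X])
    return np.array([np.bincount(row).argmax() for row in votes])


def branch_importance(trees, node, X_val, y_val):
    """Stand-in score: drop in whole-ensemble validation accuracy when the
    branch rooted at `node` is temporarily collapsed into a leaf."""
    base = (forest_predict(trees, X_val) == y_val).mean()
    saved = (node.feature, node.left, node.right)
    node.feature, node.left, node.right = None, None, None    # collapse
    pruned = (forest_predict(trees, X_val) == y_val).mean()
    node.feature, node.left, node.right = saved               # restore
    return base - pruned


def prune_forest(trees, X_val, y_val, tol=0.0):
    """Greedily collapse branches whose removal costs the ensemble no more
    than `tol` validation accuracy; the forest is modified in place."""
    for root in trees:
        stack = [root]
        while stack:
            node = stack.pop()
            if node.feature is None:
                continue
            if branch_importance(trees, node, X_val, y_val) <= tol:
                node.feature, node.left, node.right = None, None, None
            else:
                stack.extend([node.left, node.right])
    return trees
```

Because the score is computed against the whole ensemble rather than a single tree, the sketch follows the abstract's idea of evaluating branches with respect to the ensemble; the diversity component described in the abstract would have to be added to the score to match the paper's metric.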

Cited by 11 publications (6 citation statements). References 24 publications (31 reference statements).
“…Experiments conducted showed the strength of PEP over state of the art pruning algorithms, and PEP has recently been shown to be effective for detecting malware in Android systems (Fan, Xue, Chen, Xu, & Zhu, 2016). Forest pruning (FP; Jiang, Wu, & Guo, 2017) is a recently developed pruning method that focuses on an ensemble of decision trees in which trees' branches are pruned based on a novel metric called branch importance that indicates the importance of individual branches and nodes of the ensemble.…”
Section: Ensemble Pruning Methods
confidence: 99%
“…In addition, ARF (Ye et al., 2017) has fewer errors and more stability than classical RF; similarly, Yang et al. (2016) present improved random forests (IRF), which increase the robustness of RF and improve prediction accuracy. There is also another work (Jiang et al., 2017) that presents a new pruning method based on the importance of tree branches; this type of method can improve the accuracy of the classical RF (Dheenadayalan et al., 2016) and reduce, in particular, the size of the tree and, in general, the size of the RF.…”
Section: Related Work
confidence: 99%
“…Next, RF will be covered in more detail, while details on the other two methods can be found in the literature. Breiman introduced RF in 2001 as a classifier made up of multiple tree-structured classifiers (decision trees) {h(x, θ_k), k = 1, ...}, where the {θ_k} are independent, identically distributed random vectors, and each tree casts a unit vote for the most popular class at input x. Each RF consists of a root, nodes, and leaves, with each split producing two branches at a node (Figure ). Each node in an RF represents a decision made by the model based on a subset of the available features, while the leaves represent the outputs of the RF. These outputs, which are the predictions of the individual trees, are combined by taking the most common prediction across all trees.…”
Section: Model Types
confidence: 99%
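To make the voting scheme in the excerpt above concrete, here is a minimal sketch that builds Breiman-style trees h(x, θ_k) with scikit-learn decision trees, where each θ_k corresponds to an independent bootstrap sample plus per-split random feature selection, and combines the trees by majority vote. The dataset, the number of trees, and the hyperparameters are arbitrary placeholders, not values from the cited work.

```python
# Minimal Breiman-style random forest by hand: bootstrap + random feature
# subsets per split, then a unit vote per tree for the most popular class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

n_trees = 25
trees = []
for k in range(n_trees):
    # theta_k: an independent bootstrap sample of the training data,
    # combined with per-split random feature selection (max_features="sqrt")
    idx = rng.integers(0, len(X), len(X))
    tree = DecisionTreeClassifier(max_features="sqrt", random_state=k)
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Each tree casts a unit vote; the forest outputs the most popular class.
votes = np.stack([t.predict(X) for t in trees])     # shape: (n_trees, n_samples)
forest_pred = np.apply_along_axis(
    lambda col: np.bincount(col).argmax(), axis=0, arr=votes
)
print("training accuracy of the voted forest:", (forest_pred == y).mean())
```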