2022
DOI: 10.1016/j.datak.2022.102088
On the explanatory power of Boolean decision trees

Cited by 20 publications (10 citation statements)
References 45 publications
“…Later works showed the same complexity for decision graphs [47] and some classes of tractable circuits [14], [48]. The generation of sufficient reasons for decision trees was also studied in [49], including the generation of shortest sufficient reasons, which was shown to be hard even for a single reason. The generation of shortest sufficient reasons was also studied in a broader context that includes decision graphs and SDDs [4].…”
Section: Discussion (mentioning)
confidence: 98%
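
The passage above distinguishes arbitrary (subset-minimal) sufficient reasons from shortest ones, which are hard to generate even for decision trees. As a rough illustration of what generating a sufficient reason involves, here is a minimal Python sketch, not the algorithm from the cited works: it greedily drops features from a complete instance while the tree's decision remains forced. The tuple encoding of trees (leaf = 0 or 1, internal node = (feature, low_child, high_child)) is an assumption made only for this example.

```python
# Minimal sketch (not the cited algorithms): one subset-minimal sufficient
# reason for a Boolean decision tree, obtained by greedy feature deletion.
# Tree encoding (an assumption for this example): leaf = 0 or 1,
# internal node = (feature, low_child, high_child).

def classify(tree, instance):
    """Follow the root-to-leaf path selected by a complete instance."""
    while tree not in (0, 1):
        feature, low, high = tree
        tree = high if instance[feature] else low
    return tree

def forces(tree, partial, target):
    """True iff every completion of the partial assignment reaches class `target`."""
    if tree in (0, 1):
        return tree == target
    feature, low, high = tree
    if feature in partial:
        return forces(high if partial[feature] else low, partial, target)
    # Unset feature: both subtrees must yield the target class.
    return forces(low, partial, target) and forces(high, partial, target)

def sufficient_reason(tree, instance):
    """Greedily shrink the instance into a subset-minimal sufficient reason."""
    target = classify(tree, instance)
    reason = dict(instance)
    for feature in list(instance):
        value = reason.pop(feature)       # tentatively drop the feature
        if not forces(tree, reason, target):
            reason[feature] = value       # it was needed, so restore it
    return reason

# Toy tree computing x1 AND x2; feature x3 is irrelevant.
tree = ("x1", 0, ("x2", 0, 1))
print(sufficient_reason(tree, {"x1": 1, "x2": 1, "x3": 0}))  # -> {'x1': 1, 'x2': 1}
```

Note that such a greedy pass yields a subset-minimal reason but not necessarily a shortest one, which is consistent with the hardness result mentioned in the quote.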
“…The complexity of shortest sufficient reasons was studied in [55] for Boolean classifiers which correspond to decision graphs and for neural networks with ReLU activation functions. It was further shown that the number of necessary reasons is linear in the decision tree size [4], [47], that all such reasons can be computed in polynomial time [4], [49], and that the shortest necessary reasons can be enumerated with polynomial delay if the classifier satisfies some conditions, as stated in [56]. Further complexity results were shown in [14], [48], where classifiers were categorized based on the tractable circuits that represent them [48] or the kinds of processing they permit in polynomial time [14].…”
Section: Discussion (mentioning)
confidence: 99%
“…Besides the clustering, determining both the impact of various features on the clusters and their impact on the estimation of the environmental performance is essential to determine the correlation of the features. Decision tree approaches have performed well in providing an explanatory framework for the resulting outputs of deep learning frameworks (Audemard et al., 2022).…”
Section: Methods (mentioning)
confidence: 99%
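
The Methods passage above uses a decision tree as an explanatory layer for a deep model's outputs, i.e., a surrogate model. The sketch below illustrates that general idea with scikit-learn; the black-box predictor and the synthetic data are placeholders and do not come from the cited study.

```python
# Minimal sketch of a surrogate decision tree explaining a black-box model's
# outputs. The black-box predictor and the data below are placeholders, not
# taken from the cited study.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.random((500, 4))                         # stand-in feature matrix

def black_box_predict(X):
    """Stand-in for a trained deep model's hard predictions."""
    return (X[:, 0] + X[:, 2] > 1.0).astype(int)

y_hat = black_box_predict(X)                     # labels come from the model, not ground truth
surrogate = DecisionTreeClassifier(max_depth=3).fit(X, y_hat)

print("fidelity to the black box:", surrogate.score(X, y_hat))
print(export_text(surrogate, feature_names=[f"f{i}" for i in range(4)]))
```

The fidelity score indicates how faithfully the shallow tree mimics the black-box predictions; its printed rules are then read as an approximate explanation of the model's behaviour.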
“…Its objective is to create a model for predicting a given value by learning simple rules deduced from the characteristics of the data. The classification process is applied through a set of rules or conditions that determine the path followed from the root node, ending at one of the final nodes, which represents the final decision [19]. At each non-final node, a decision must be made about which node to visit next.…”
Section: Decision Tree (mentioning)
confidence: 99%
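
For concreteness, here is a minimal sketch of the root-to-leaf classification process described in the quote above; the dictionary-based node layout and the example rules are hypothetical and not taken from [19].

```python
# Minimal sketch of root-to-leaf classification: each non-final node applies a
# rule on one feature, and the reached final node carries the decision.
# The node layout and the example rules are hypothetical.

def decide(node, sample):
    while "decision" not in node:                # non-final node: apply its rule
        if sample[node["feature"]] <= node["threshold"]:
            node = node["left"]
        else:
            node = node["right"]
    return node["decision"]                      # final node holds the decision

tree = {
    "feature": "age", "threshold": 30,
    "left": {"decision": "reject"},
    "right": {"feature": "income", "threshold": 50_000,
              "left": {"decision": "reject"},
              "right": {"decision": "accept"}},
}
print(decide(tree, {"age": 42, "income": 60_000}))  # -> accept
```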
“…A decision is the choice of a solution from among several solutions to a particular problem. Decision-making is therefore the selection of one of the available alternatives, and the decision-making process is a series of stages and procedures that ultimately lead to selecting the best alternative [19,20].…”
Section: Decision Tree (mentioning)
confidence: 99%