2019
DOI: 10.1007/s11263-019-01237-6

End-to-End Learning of Decision Trees and Forests

Abstract: Conventional decision trees have a number of favorable properties, including a small computational footprint, interpretability, and the ability to learn from little training data. However, they lack a key quality that has helped fuel the deep learning revolution: that of being end-to-end trainable. Kontschieder et al. (ICCV, 2015) have addressed this deficit, but at the cost of losing a main attractive trait of decision trees: the fact that each sample is routed along a small subset of tree nodes only. We here…
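To make the abstract's core idea concrete, below is a minimal sketch (our illustration, not the authors' implementation) of the kind of soft, differentiable tree this line of work builds on: each inner node emits a sigmoid routing probability, so the whole model can be trained end-to-end by gradient descent. The depth, architecture, and parameter names are assumptions for illustration; the paper's contribution is recovering hard, single-path routing from such a model.

```python
# Minimal sketch (our illustration, NOT the authors' code) of a depth-2
# soft decision tree: every inner node emits a sigmoid routing probability,
# which makes the whole tree differentiable and end-to-end trainable.
# Hehn et al. additionally anneal such soft splits toward hard ones, so that
# at test time each sample visits only a single root-to-leaf path.
import torch
import torch.nn as nn

class SoftTree(nn.Module):
    def __init__(self, n_features, n_classes, steepness=1.0):
        super().__init__()
        # three inner nodes (root + two children), each an oblique split w^T x + b
        self.splits = nn.Linear(n_features, 3)
        # four leaves, each holding class logits (initialization is arbitrary)
        self.leaves = nn.Parameter(0.1 * torch.randn(4, n_classes))
        self.steepness = steepness  # increased during annealing to harden splits

    def forward(self, x):
        s = torch.sigmoid(self.steepness * self.splits(x))   # (batch, 3)
        # probability of a sample reaching each of the four leaves
        p = torch.stack([
            s[:, 0] * s[:, 1],
            s[:, 0] * (1 - s[:, 1]),
            (1 - s[:, 0]) * s[:, 2],
            (1 - s[:, 0]) * (1 - s[:, 2]),
        ], dim=1)                                             # (batch, 4)
        return p @ self.leaves                                # expected leaf logits

tree = SoftTree(n_features=8, n_classes=3)
logits = tree(torch.randn(16, 8))   # usable with any standard loss and optimizer
```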

Cited by 42 publications (31 citation statements) · References 45 publications
“…Hard Tree selects a subpath for instances according to a specific feature and threshold. In the multivariate tree (Irsoy et al., 2012; Norouzi et al., 2015; Irsoy et al., 2014; Hehn et al., 2019), which is also called a soft tree, θ_i is a continuous variable and s(x; θ_i) defines an oblique split.…”
Section: Soft Tree
confidence: 99%
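As a hedged illustration of the two split types contrasted in this quote (function names and example values are ours, not from the cited papers): a hard split thresholds a single feature, while an oblique split s(x; θ) thresholds a learned linear combination of all features.

```python
# Hedged illustration (function names and example values are ours) of the two
# split types contrasted above: a hard split thresholds one feature, while an
# oblique split s(x; theta) thresholds a learned linear combination of features.
import numpy as np

def hard_split(x, feature, threshold):
    # axis-aligned test: route on a single feature only
    return x[feature] <= threshold

def oblique_split(x, theta, bias):
    # s(x; theta): soft routing probability from the hyperplane theta^T x + b
    return 1.0 / (1.0 + np.exp(-(theta @ x + bias)))

x = np.array([0.2, -1.3, 0.7])
print(hard_split(x, feature=1, threshold=0.0))                        # True -> left child
print(oblique_split(x, theta=np.array([1.0, 0.5, -2.0]), bias=0.1))   # ~0.148
```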
“…Being aware of the benefits of discretization in terms of interpretability, End2End Tree (Hehn et al., 2019) proposes a multivariate discrete tree. End2End Tree is fully probabilistic at train time but becomes deterministic at test time after an annealing process.…”
Section: Soft Tree
confidence: 99%
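Below is a minimal sketch of the annealing idea this quote describes, under the assumption (consistent with the quote, but simplified by us) that a steepness parameter sharpens each sigmoid split during training until it can be replaced by a hard, deterministic test at inference time.

```python
# Minimal sketch of the annealing idea (our simplification, not End2End Tree's
# exact schedule): a steepness parameter sharpens the sigmoid split during
# training; at test time the split is replaced by a hard, deterministic test.
import numpy as np

def soft_route(x, theta, bias, steepness):
    # probability of going left; approaches a step function as steepness grows
    return 1.0 / (1.0 + np.exp(-steepness * (theta @ x + bias)))

theta, bias = np.array([0.8, -0.3]), 0.05
x = np.array([0.4, 0.9])

for steepness in (1, 4, 16, 64):        # annealing schedule: splits sharpen
    print(steepness, soft_route(x, theta, bias, steepness))

go_left = (theta @ x + bias) > 0        # deterministic routing at test time
```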
“…Since each person has multiple sliding windows, here we take the number of windows for every person in each class as the new features for classification. The decision tree is used as the classifier (Douglas et al., 2011), where the subject searches along a single path from the root to the leaf, and the path depends on the characteristics of the sample (Hehn et al., 2019). The decision tree can handle uneven data without standardizing and quantifying the data, and its logic is simple and intuitive.…”
Section: Introduction
confidence: 99%
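For concreteness, here is a small sketch (class and field names are ours) of the single root-to-leaf traversal the quote describes: each inner node's test sends the sample to exactly one child, so only a small subset of the tree's nodes is ever evaluated per sample.

```python
# Small sketch (class and field names are ours) of the single root-to-leaf
# traversal described above: each inner node's test selects exactly one child,
# so only a small subset of the tree's nodes is evaluated per sample.
class Node:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, label=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.label = left, right, label

def classify(node, x):
    while node.label is None:           # descend until a leaf is reached
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.label

# tiny hand-built tree: root tests feature 0; its right child tests feature 1
tree = Node(feature=0, threshold=0.5,
            left=Node(label="A"),
            right=Node(feature=1, threshold=2.0,
                       left=Node(label="B"), right=Node(label="C")))
print(classify(tree, [0.7, 1.5]))       # -> "B"
```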