2019
DOI: 10.1007/978-3-030-12939-2_42
End-to-End Learning of Deterministic Decision Trees

Abstract: Conventional decision trees have a number of favorable properties, including interpretability, a small computational footprint and the ability to learn from little training data. However, they lack a key quality that has helped fuel the deep learning revolution: that of being end-to-end trainable, and to learn from scratch those features that best allow to solve a given supervised learning problem. Recent work (Kontschieder 2015) has addressed this deficit, but at the cost of losing a main attractive trait of …


Cited by 8 publications (6 citation statements) · References 25 publications
“…Therefore, only the most probable path needs to be evaluated at test time. Similar results were reported in [25], [42], and our experimental results support this single-path inference scheme.…”

Section: Single-path Inference (supporting, confidence: 92%)
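The single-path scheme described in the excerpt above can be sketched as follows. This is a minimal illustration under our own assumptions: the heap-indexed tree layout, the linear-plus-sigmoid routing functions, and all names are hypothetical, not taken from the cited papers.

```python
import math

# Hypothetical soft decision tree stored as flat arrays: internal node i has a
# linear routing function sigmoid(w[i]·x + b[i]) giving the probability of
# going right; leaves hold class-probability vectors. Nodes use the usual
# heap layout, so the children of node i are 2*i+1 and 2*i+2.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def single_path_predict(x, w, b, leaves, depth):
    """Greedy single-path inference: at each node, descend only into the
    more probable child instead of averaging over all root-to-leaf paths."""
    node = 0
    for _ in range(depth):
        p_right = sigmoid(sum(wi * xi for wi, xi in zip(w[node], x)) + b[node])
        node = 2 * node + 2 if p_right >= 0.5 else 2 * node + 1
    return leaves[node - (2 ** depth - 1)]  # map heap index to leaf slot
```

At test time this evaluates only one routing node per level, rather than mixing the predictions of every leaf.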
“…The training process for probabilistic trees is to maximize the above objective function. A detailed discussion on estimating the optimal parameters (θ*, ω*) can be found in [25]. Intuitively, by maximizing Eq.…”

Section: Learning Procedures (mentioning, confidence: 99%)
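The objective function referenced in this excerpt is not reproduced on this page. A common form, following the soft decision tree formulation of Kontschieder et al. [25] (the exact notation here is our assumption, stated in the standard presentation), is the log-likelihood of the training set under the mixture of leaf predictors:

```latex
\max_{\theta,\,\omega} \;
\sum_{(x,y)\in\mathcal{T}} \log
\Big( \sum_{\ell \in \mathcal{L}} \mu_{\ell}(x;\theta)\, \pi_{\ell y}(\omega) \Big)
```

where μ_ℓ(x; θ) is the probability that the routing functions send sample x to leaf ℓ, and π_ℓy(ω) is the probability that leaf ℓ assigns to class y. Maximizing it jointly fits the routing parameters θ and the leaf distributions ω.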
“…This work extends our previous conference contribution (Hehn and Hamprecht 2018), where we introduced end-to-end learning for decision trees. Here, we add an extension to decision forests, which we compare to state-of-the-art methods for training forests, and provide additional results on interpretability and the effect of the steepness parameter.…”

Section: Contributions (supporting, confidence: 59%)
“…Neural Trees With Hard Routing. Another type of neural DT is proposed in [11, 12], where the forward pass uses the routing nodes to make a hard decision, so that only the relevant nodes of the tree are visited. The authors of [11] introduce a routing function that outputs binary values at test time only, so the tree still performs soft decisions during training, allowing it to be trained end-to-end.…”

Section: Related Work (mentioning, confidence: 99%)