2021
DOI: 10.1080/10618600.2021.1872581

Predictive Distribution Modeling Using Transformation Forests

Abstract: Regression models for supervised learning problems with a continuous response are commonly understood as models for the conditional mean of the response given predictors. This notion is simple and therefore appealing for interpretation and visualization. Information about the whole underlying conditional distribution is, however, not available from these models. A more general understanding of regression models as models for conditional distributions allows much broader inference, for example, the computation …

Cited by 26 publications (42 citation statements). References 38 publications.
“…As splitting procedures that are internally used to construct trees can detect changes in the mean only, standard tree-based implementations are not able to recognize any distributional changes (e.g., change of variance), even if these can be related to covariates (Hothorn and Zeileis, 2021). As such, basic versions of XGBoost and LightGBM do not provide a way to model the full predictive distribution F_Y(y | x), as they focus on predicting the conditional mean E(Y | X = x) only.…”
Section: About Here]
Mentioning, confidence: 99%
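The distinction quoted above can be made concrete with a small simulation. The sketch below (plain NumPy and scikit-learn, not code from the cited paper) generates data whose conditional mean is constant while the conditional variance depends on the covariate; a mean-based regression tree then has essentially nothing to split on, even though the conditional distribution clearly changes with x.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 5000
x = rng.uniform(-1, 1, size=(n, 1))
# Conditional mean is constant (zero), but the conditional variance grows with |x|.
y = rng.normal(loc=0.0, scale=0.5 + 2.0 * np.abs(x[:, 0]))

# A mean-based splitting criterion picks up almost nothing: fitted means stay near zero.
mean_tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=200).fit(x, y)
print("mean predictions:", np.round(mean_tree.predict([[-0.9], [0.0], [0.9]]), 2))

# The conditional distribution, however, changes strongly with x,
# as a crude empirical check on the simulated data shows:
for x0 in (-0.9, 0.0, 0.9):
    nearby = np.abs(x[:, 0] - x0) < 0.05
    lo, hi = np.quantile(y[nearby], [0.05, 0.95])
    print(f"x = {x0:+.1f}: empirical 90% range [{lo:.2f}, {hi:.2f}]")

Splitting criteria that test for distributional differences rather than mean differences only, as in transformation trees and forests, are designed to detect exactly this kind of heteroscedastic signal.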
“…There are strong links between the general model (3) and the so-called transformation models proposed by Hothorn et al. (2014) and Hothorn et al. (2018), which have been extended to transformation forests more recently by Hothorn and Zeileis (2021). Transformation models assume P(Y ≤ y) = F(h(y)), where h(·)…”
Section: Motivation and Basic Model
Mentioning, confidence: 99%
“…is an unknown increasing transformation function. The transformation function can be additive (Hothorn et al., 2014) or nonparametric (Hothorn and Zeileis, 2021). Although the models look quite similar, there are some crucial differences.…”
Section: Motivation and Basic Model
Mentioning, confidence: 99%
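To make the quoted model concrete: a transformation model specifies P(Y ≤ y) = F(h(y)) and estimates the increasing function h. The sketch below is a deliberately minimal illustration, not the estimator of the cited papers: it fixes F to the standard normal CDF and restricts h to the linear form h(y) = theta0 + exp(theta1) * y, so that maximum likelihood simply recovers a Gaussian fit; the cited work replaces this h with far more flexible (additive or nonparametric) transformations that may also depend on covariates.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
y = rng.normal(loc=2.0, scale=0.5, size=1000)

def negloglik(theta):
    # Implied density: f(y) = phi(h(y)) * h'(y), with phi the standard normal density.
    h = theta[0] + np.exp(theta[1]) * y      # increasing in y by construction
    h_prime = np.exp(theta[1])
    return -np.sum(norm.logpdf(h) + np.log(h_prime))

fit = minimize(negloglik, x0=np.zeros(2))
a, b = fit.x[0], np.exp(fit.x[1])
# For this linear h, P(Y <= y) = Phi(a + b*y), i.e. Y ~ N(-a/b, 1/b^2);
# the estimates should roughly recover mean 2.0 and sd 0.5.
print("implied mean:", -a / b, "implied sd:", 1.0 / b)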
“…These approaches differ according to (1) the method used to build the forest and (2) the method used to build the prediction interval. There are four methods to build the forest: three from the classification and regression tree (CART) paradigm (Breiman et al., 1984) and the transformation forest method (TRF) proposed by Hothorn and Zeileis (2021). Within the CART paradigm, in addition to the default least-squares (LS) splitting criterion, two alternative splitting criteria, L1 and shortest prediction interval (SPI), are considered.…”
Section: Introduction
Mentioning, confidence: 99%
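For context on how a forest can yield a prediction interval at all, the sketch below implements one simple strategy: pool the training responses that share a leaf with the query point in each tree and take empirical quantiles, in the spirit of quantile regression forests. It is not the TRF, LS, L1, or SPI method compared in the citing paper, and the simulated data, model settings, and helper name forest_interval are illustrative assumptions only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(2000, 1))
y = np.sin(2 * X[:, 0]) + rng.normal(scale=0.3 + 0.3 * np.abs(X[:, 0]))

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
rf.fit(X, y)

def forest_interval(x_new, level=0.90):
    # Pool the training responses that fall into the same leaf as x_new in each tree,
    # then read off empirical quantiles as a crude prediction interval.
    leaves_train = rf.apply(X)        # shape (n_train, n_trees): leaf id per tree
    leaves_new = rf.apply(x_new)[0]   # shape (n_trees,): leaf ids of the query point
    pooled = np.concatenate(
        [y[leaves_train[:, t] == leaves_new[t]] for t in range(len(leaves_new))]
    )
    alpha = (1.0 - level) / 2.0
    return np.quantile(pooled, [alpha, 1.0 - alpha])

print("90% interval at x = 1.5:", forest_interval(np.array([[1.5]])))

Because the pooled responses reflect the whole conditional distribution in the relevant leaves, the interval widens where the noise level is larger, which a single mean prediction cannot convey.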