Deep Conditional Transformation Models (2021)
DOI: 10.1007/978-3-030-86523-8_1

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1
1

Citation Types

0
8
0

Year Published

2021
2021
2024
2024

Publication Types

Select...
5
3

Relationship

3
5

Authors

Journals

citations
Cited by 9 publications (8 citation statements); references 24 publications.

Citation statements (ordered by relevance):
“…To allow for a more nuanced view on uncertainty, we follow Hüllermeier and Waegeman (2021) and distinguish between two sources of uncertainty for any machine learning task: while epistemic uncertainty accounts for uncertainty in the model that can generally be reduced given sufficient data, this paper is concerned with aleatoric uncertainty, which reflects the randomness inherent in observations. This type of uncertainty can be captured by modelling a conditional outcome probability distribution that is dependent on features (Baumann et al, 2021).…”
Section: DGBM: Distributional Gradient Boosting Machines (mentioning, confidence: 99%)
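To make the distinction in the statement above concrete, here is a minimal sketch of capturing aleatoric uncertainty by letting a network predict the parameters of a conditional outcome distribution. It assumes a Gaussian outcome; the architecture, layer sizes, and names are illustrative and not taken from the cited papers.

```python
import torch
import torch.nn as nn

class ConditionalGaussian(nn.Module):
    """Predicts mean and scale of p(y | x), so the spread itself depends on features."""
    def __init__(self, n_features, n_hidden=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_features, n_hidden), nn.ReLU())
        self.mean_head = nn.Linear(n_hidden, 1)
        self.log_scale_head = nn.Linear(n_hidden, 1)  # log-scale keeps scale > 0

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.log_scale_head(h).exp()

def nll(model, x, y):
    # Negative log-likelihood of the conditional Normal: the training loss that
    # lets the noise level vary with x (aleatoric uncertainty), rather than
    # assuming a fixed residual variance. y is expected with shape (N, 1).
    mean, scale = model(x)
    return -torch.distributions.Normal(mean, scale).log_prob(y).mean()
```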
“…that is learnt from the data (Baumann et al, 2021). However, instead of a transformation from z to y, Transformation Models define an inverse flow h(y) = z (Rügamer et al, 2022).…”
Section: B NFBoost: Normalizing Flow Boosting (mentioning, confidence: 99%)
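The inverse-flow idea mentioned above can be restated in standard transformation-model notation; this is a generic presentation (with F_Z a fixed reference distribution such as the standard logistic or Gaussian), not a formula copied from the cited works.

```latex
% Transformation models: instead of learning a map z -> y, learn the inverse
% flow h with h(y | x) = z, where h(. | x) is monotone in y and F_Z is a fixed
% reference CDF. The conditional CDF and density then follow by change of variables:
F_{Y}(y \mid x) = F_{Z}\bigl(h(y \mid x)\bigr),
\qquad
f_{Y}(y \mid x) = f_{Z}\bigl(h(y \mid x)\bigr)\,\frac{\partial h(y \mid x)}{\partial y}.
```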
“…Recent approaches (Rügamer et al., 2020) suggest fitting structured regression models as (part of) a neural network. This can result in better space complexity in situations with many data points and allows for more flexibility in the additive predictors of models beyond those of classical GAMs (see, e.g., Baumann et al., 2021; Kopper et al., 2021).…”
Section: Related Literature (mentioning, confidence: 99%)
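A minimal sketch of what "fitting a structured regression model as part of a neural network" can look like: an interpretable linear (or spline-basis) predictor combined additively with a deep part. The split of features and the simple linear structured layer are illustrative assumptions, not the exact architecture of the cited works.

```python
import torch
import torch.nn as nn

class SemiStructuredNet(nn.Module):
    def __init__(self, n_structured, n_deep, n_hidden=64):
        super().__init__()
        # Structured part: an interpretable linear (or basis-expanded) predictor.
        self.structured = nn.Linear(n_structured, 1, bias=True)
        # Deep part: an unstructured network for the remaining features.
        self.deep = nn.Sequential(
            nn.Linear(n_deep, n_hidden), nn.ReLU(), nn.Linear(n_hidden, 1)
        )

    def forward(self, x_structured, x_deep):
        # Additive predictor: interpretable effects plus a flexible deep correction.
        return self.structured(x_structured) + self.deep(x_deep)
```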
“…A recent trend is the combination of neural networks with statistical regression models in various ways [41,42,43,44,45,46,47]. In this work, we make use of semi-structured deep distributional regression [SDDR,48].…”
Section: Semi-structured Deep Distributional Regression (mentioning, confidence: 99%)
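To illustrate the SDDR idea referenced above, the sketch below gives each parameter of the outcome distribution its own additive predictor combining a structured and a deep part. This is a hypothetical re-implementation for illustration only (it assumes a Gaussian outcome and omits the orthogonalization step the SDDR framework uses to keep structured effects identifiable); it is not the SDDR package API.

```python
import torch
import torch.nn as nn

class SDDRGaussian(nn.Module):
    def __init__(self, n_structured, n_deep, n_hidden=32):
        super().__init__()
        self.deep_body = nn.Sequential(nn.Linear(n_deep, n_hidden), nn.ReLU())
        # One structured and one deep head per distribution parameter (mean, scale).
        self.structured_mean = nn.Linear(n_structured, 1)
        self.structured_scale = nn.Linear(n_structured, 1)
        self.deep_mean = nn.Linear(n_hidden, 1)
        self.deep_scale = nn.Linear(n_hidden, 1)

    def forward(self, x_structured, x_deep):
        h = self.deep_body(x_deep)
        mean = self.structured_mean(x_structured) + self.deep_mean(h)
        scale = (self.structured_scale(x_structured) + self.deep_scale(h)).exp()
        # Returns the full conditional distribution, so both location and spread
        # depend on features through structured and deep model parts.
        return torch.distributions.Normal(mean, scale)
```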