2008
DOI: 10.1198/106186008x319331

Model-Based Recursive Partitioning

Original citation: Zeileis, Achim, Hothorn, Torsten, and Hornik, Kurt (2005). Model-Based Recursive Partitioning.

Abstract: Recursive partitioning is embedded into the general and well-established class of parametric models that can be fitted using M-type estimators (including maximum likelihood). An algorithm for model-based recursive partitioning is suggested for which the basic steps are: (1) fit a parametric model to a data set, (2) test for parameter instability…
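The basic steps sketched in the abstract (fit a model, test its parameters for instability over candidate splitting variables, split, and recurse) are available through mob() and its convenience wrappers in the R packages party and partykit. The following sketch only illustrates that workflow with partykit::lmtree() on simulated data; the variable names and the data-generating process are invented for the example and are not taken from the paper.

## Illustrative sketch of model-based recursive partitioning with
## partykit::lmtree(); data and variable names are invented for the example.
library("partykit")

set.seed(1)
n <- 500
z <- runif(n)                                  # potential partitioning variable
x <- rnorm(n)                                  # regressor of the node models
y <- ifelse(z > 0.5, 1 + 2 * x, 1 - 2 * x) +   # slope changes with z
     rnorm(n, sd = 0.5)
d <- data.frame(y, x, z)

## Step (1): fit y ~ x by least squares; step (2): test the coefficients for
## instability over z; step (3): split where the instability is strongest,
## then repeat in the daughter nodes; lmtree() handles all of this internally.
mb_tree <- lmtree(y ~ x | z, data = d)
print(mb_tree)
coef(mb_tree)   # per-node parameter estimates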

Cited by 585 publications (634 citation statements) · References 41 publications
“…In addition, tree models do not assume a linear relationship between predictors and the dependent variable and they are very useful for modelling higher-order interaction effects between predictor variables automatically. For this study we used a particular family of tree models called conditional inference trees that combine the rigorous theory of permutation statistics (Hothorn, Hornik, & Zeileis, 2006) with the principle of recursive partitioning (Zeileis, Hothorn, & Hornik, 2008).…”
Section: Exploratory Analysis (Regression Tree Model) · mentioning
confidence: 99%
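For context, a conditional inference tree of the kind this excerpt describes can be fitted with ctree(); the snippet below is only a generic illustration using partykit::ctree() on a built-in R data set (airquality), not the models of the citing study.

## Illustrative only: a conditional inference tree with partykit::ctree().
## Split variables and split points are chosen by permutation tests, so no
## linear relationship between predictors and response is assumed.
library("partykit")

aq <- na.omit(airquality)      # built-in data set, complete cases only
ct <- ctree(Ozone ~ Temp + Wind + Solar.R, data = aq)
print(ct)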
“…Repeat steps 1-3 recursively in the resulting nodes until there are no more significant instabilities (or the number of data sets left in a node falls below a given stopping value). The statistical framework employed for testing the significance of instabilities in the model parameters is described in detail in [19] and [24]. The rationale of the underlying tests for parameter instability is that the individual deviations from a joint model are considered over the range of each potential splitting variable: as illustrated in Figure 4, the strength parameter for algorithm a_k may show a systematic change when considered over the range of the characteristic c_j (such as the first canonical correlation, which indicates linear separability), while over the range of other characteristics the parameter may vary only randomly.…”
Section: Preference Scaling · mentioning
confidence: 99%
“…The rationale of the underlying tests for parameter instability is that the individual deviations from a joint model are considered over the range of each potential splitting variable: as illustrated in Figure 4, the strength parameter for algorithm a_k may show a systematic change when considered over the range of the characteristic c_j (such as the first canonical correlation, which indicates linear separability), while over the range of other characteristics the parameter may vary only randomly. Statistical tests are available to detect such systematic parameter instabilities and select the splitting variable or characteristic inducing the strongest instability [19,24].…”
Section: Preference Scaling · mentioning
confidence: 99%
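The variable-selection step these excerpts describe (test each candidate partitioning variable for parameter instability and split on the one inducing the strongest significant instability) can be inspected directly in R. The sketch below uses simulated data and the sctest() method that partykit provides for fitted model trees (the strucchange package supplies the generic); it is an illustration of the idea, not the analysis from the citing paper.

## Illustrative only: per-variable parameter instability tests in the root
## node of a model-based tree; data and variable names are invented.
library("partykit")
library("strucchange")                 # provides the sctest() generic

set.seed(42)
n  <- 400
z1 <- runif(n)                         # characteristic with a real effect
z2 <- runif(n)                         # irrelevant characteristic
x  <- rnorm(n)
y  <- ifelse(z1 > 0.5, 2, -2) * x + rnorm(n)
d  <- data.frame(y, x, z1, z2)

tr <- lmtree(y ~ x | z1 + z2, data = d)

## Instability test statistics and p-values per candidate splitting variable
## in node 1: z1 should show a far smaller p-value than z2 and is therefore
## selected for the first split.
sctest(tr, node = 1)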
“…Aiming to identify a lung miRNA expression signature predictive of ALK, EGFR, and KRAS mutational status, we used the Conditional Inference classification Trees (CTree) implemented in the Bioconductor package party (10); the prediction accuracy of the classification algorithm was estimated using 10-fold cross-validation (11).…”
Section: ALK-, EGFR-, KRAS-driven · mentioning
confidence: 99%
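As a generic illustration of the workflow this excerpt describes (a conditional inference tree classifier whose prediction accuracy is estimated by 10-fold cross-validation), the following R sketch uses partykit::ctree(), the successor interface to party::ctree(), on the built-in iris data; it is not the miRNA signature analysis of the citing study.

## Illustrative only: 10-fold cross-validated accuracy of a ctree() classifier.
library("partykit")

data(iris)
set.seed(7)
folds <- sample(rep(1:10, length.out = nrow(iris)))   # random fold assignment

acc <- sapply(1:10, function(k) {
  fit  <- ctree(Species ~ ., data = iris[folds != k, ])   # train on 9 folds
  pred <- predict(fit, newdata = iris[folds == k, ])      # predict held-out fold
  mean(pred == iris$Species[folds == k])
})
mean(acc)   # estimated prediction accuracy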