2018
DOI: 10.1080/00273171.2018.1461602

Recursive Partitioning with Nonlinear Models of Change

Abstract: In this article, we introduce nonlinear longitudinal recursive partitioning (nLRP) and the R package longRpart2 to carry out the analysis. This method implements recursive partitioning (also known as decision trees) in order to split data based on individual- (i.e., cluster-) level covariates with the goal of predicting differences in nonlinear longitudinal trajectories. At each node, a user-specified linear or nonlinear mixed-effects model is estimated. This method is an extension of Abdolell et al.'s (2002) l…
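The node-level computation described in the abstract can be sketched in R, the language of longRpart2. The snippet below is an illustrative reconstruction rather than the package's API: the data set dat, the variable names (score, time, id), the starting values, and the helper functions neg2LL() and evaluate_split() are all hypothetical. It fits a user-specified nonlinear mixed-effects model (here an asymptotic growth curve via nlme) to a parent node and to the two child nodes produced by a candidate cut on a cluster-level covariate, and records the resulting improvement in −2 log-likelihood.

library(nlme)

## -2 log-likelihood of a user-specified nonlinear mixed-effects
## growth model fit to the data that fall in a single node
neg2LL <- function(dat) {
  fit <- nlme(score ~ SSasymp(time, Asym, R0, lrc),
              fixed  = Asym + R0 + lrc ~ 1,
              random = Asym ~ 1,
              groups = ~ id,
              start  = c(Asym = 100, R0 = 20, lrc = -1),  # placeholder starting values
              data   = dat)
  -2 * as.numeric(logLik(fit))
}

## Fit quality gained by splitting a parent node on a cluster-level
## covariate at a given cutpoint: parent -2LL minus the summed -2LL
## of the two child-node models
evaluate_split <- function(dat, covariate, cutpoint) {
  parent <- neg2LL(dat)
  left   <- neg2LL(dat[dat[[covariate]] <= cutpoint, ])
  right  <- neg2LL(dat[dat[[covariate]] >  cutpoint, ])
  improvement <- parent - (left + right)
  c(raw = improvement, proportion = improvement / parent)
}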

Cited by 16 publications (16 citation statements)
References: 41 publications
“…The LIP yields the percent improvement in the −2LL from the baseline model for the more parameterized model. The same metric, but scaled in terms of the proportional improvement in model fit, has been implemented in recursive partitioning algorithms for mixed-effects (Abdolell et al., 2002; Stegmann et al., 2018) and structural equation models (Serang et al., 2020). The main reason to do this is to provide context regarding the relative improvement in model fit.…”
Section: An Effect Size Measure for Model Comparison
mentioning confidence: 99%
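As a hedged illustration of the metric described above (the function name and inputs are hypothetical, not an API from any of the cited packages), the percent improvement can be computed directly from the two −2 log-likelihood values:

## Improvement of the more highly parameterized model over the baseline,
## expressed relative to the baseline -2LL; illustrative helper only
lip <- function(neg2LL_baseline, neg2LL_model, percent = TRUE) {
  prop <- (neg2LL_baseline - neg2LL_model) / neg2LL_baseline
  if (percent) 100 * prop else prop
}

lip(2400, 2250)  # 6.25: a 6.25% improvement over the baseline -2LL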
“…It should be noted that there are other algorithms and software packages that allow for recursive partitioning of GLMM-type models, such as SEM trees (Brandmaier et al., 2013), longRpart (Abdolell et al., 2002), and longRpart2 (Stegmann et al., 2018). In the current paper, we focus on GLMM trees because they allow for partitioning based on variables measured at both the lowest level (e.g., patient level) and at higher levels (e.g., therapist, treatment centre, or region level).…”
Section: Unbiased Recursive Partitioning and Extension to Multilevel and Longitudinal Data
mentioning confidence: 99%
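GLMM trees are implemented in the glmertree R package; the following is a minimal sketch of such a call, assuming its three-part formula interface (node-level model | grouping for random intercepts | partitioning variables). The data set therapy_data and all variable names are hypothetical.

library(glmertree)

## Partition a linear mixed-effects growth model on covariates measured
## at the patient level (age) and at a higher level (centre_size),
## with random intercepts for patients
tree <- lmertree(outcome ~ time | patient | age + centre_size,
                 data = therapy_data)

plot(tree)   # tree structure with node-specific fitted models
coef(tree)   # fixed-effect estimates per terminal node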
“…For example, Austin (2012) compares inferences on the effect of in-hospital smoking-cessation counselling on subsequent mortality in patients hospitalized with an acute myocardial infarction using ensemble-based methods (bagged regression trees, random forests, and boosted regression trees) to directly estimate average treatment effects by imputing potential outcomes. Stegmann et al. (2018) introduce nonlinear longitudinal recursive partitioning (nLRP) and illustrate its use with empirical data from the kindergarten cohort of the Early Childhood Longitudinal Study. More recently, Grimm and Jacobucci (2020) improve the reliability of splitting functions for small data samples and illustrate their performance using data on depression and suicidal ideation from the National Longitudinal Survey of Youth.…”
Section: Introduction
mentioning confidence: 99%