2014
DOI: 10.1111/biom.12221
Simultaneous Variable Selection for Joint Models of Longitudinal and Survival Outcomes

Abstract: Joint models of longitudinal and survival outcomes have been used with increasing frequency in clinical investigations. Correct specification of fixed and random effects is essential for practical data analysis. Simultaneous selection of variables in both longitudinal and survival components functions as a necessary safeguard against model misspecification. However, variable selection in such models has not been studied. No existing computational tools, to the best of our knowledge, have been made avai…

Cited by 33 publications (34 citation statements) · References 29 publications (33 reference statements)
“…Such a latent variable could be continuous, e.g., random effects 5–11 , or discrete, e.g., latent class 12,13 , or both 14 . Recently, joint models have also been incorporated in the machine learning framework through statistical boosting 15 and variable selection by adaptive LASSO 16 . In this paper we will focus on shared random effects models, though the extension to shared latent class models is straightforward in principle.…”
Section: Introductionmentioning
confidence: 99%
“…However, Wang and Tsai showed that the GCV approach tended to choose tuning parameters that lead to an overfitted model. Following Wang and Tsai, Ha et al., and He et al., we use a BIC-type criterion based on the penalized log-likelihood for tuning parameter selection: BIC(ρ) = −2ℓₒ(ζ̂) + log(n) × k, where ζ̂ are the estimators obtained by maximizing the penalized log-likelihood at a given tuning parameter combination, denoted ρ = (ρ_β, ρ_γ), and ℓₒ(ζ̂) is the penalized log-likelihood evaluated at the estimated ζ̂. Here k is the total number of nonzero estimates in ζ̂.…”
Section: Linear and Nonlinear Variables Selectionmentioning
confidence: 99%
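The BIC-type criterion quoted above is simple to evaluate once each candidate model has been fit. A minimal sketch follows; the tuning-parameter grid, sample size, and penalized log-likelihood values are hypothetical placeholders, not taken from the paper:

```python
import math

def bic(penalized_loglik, n, k):
    """BIC-type criterion from the excerpt:
    BIC(rho) = -2 * l(zeta_hat) + log(n) * k,
    where l(zeta_hat) is the penalized log-likelihood at the estimates
    for a given tuning pair rho = (rho_beta, rho_gamma), and k is the
    number of nonzero estimates."""
    return -2.0 * penalized_loglik + math.log(n) * k

# Hypothetical fits at three tuning-parameter combinations (illustrative values).
fits = [
    {"rho": (0.1, 0.1), "loglik": -520.3, "k": 12},
    {"rho": (0.5, 0.5), "loglik": -524.1, "k": 7},
    {"rho": (1.0, 1.0), "loglik": -540.8, "k": 3},
]
n = 200  # assumed sample size

# Select the tuning combination minimizing the BIC-type criterion.
best = min(fits, key=lambda f: bic(f["loglik"], n, f["k"]))
print(best["rho"])
```

Heavier penalties shrink more coefficients to zero (smaller k) at the cost of a worse log-likelihood; the criterion trades these off with a log(n) complexity weight.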
“…The off‐diagonal penalty on a precision matrix was considered by Zhang and Zou for graphical models. We note that He et al. imposed penalties on the diagonal elements of the covariance matrix since their goal is to select the random covariates. Our goal is not to select random intercepts and slopes in the submodel of longitudinal variables.…”
Section: Variable Selection Through Penalized Likelihoodmentioning
confidence: 99%
“…We are interested in the association between trajectories of multiple longitudinal processes and survival time. Therefore, the joint model we consider in this article is different from that in He et al…”
Section: Introductionmentioning
confidence: 98%