2018
DOI: 10.1111/biom.12869
Model Selection for Semiparametric Marginal Mean Regression Accounting for Within-Cluster Subsampling Variability and Informative Cluster Size

Abstract: We propose a model selection criterion for semiparametric marginal mean regression based on generalized estimating equations. The work is motivated by a longitudinal study on the physical frailty outcome in the elderly, where the cluster size, that is, the number of the observed outcomes in each subject, is "informative" in the sense that it is related to the frailty outcome itself. The new proposal, called Resampling Cluster Information Criterion (RCIC), is based on the resampling idea utilized in the within-…

Cited by 2 publications (4 citation statements)
References 16 publications
“…The RCSIC is then equivalent to the RCIC. In the settings considered with constant cluster size (n), the second and third terms of the RCSIC clearly reduce to the model dimension p multiplied by a cluster-size penalty term (n+1) for the model selection criterion 32 . Since the RCSIC puts a larger penalty on the model dimension p than the traditional Akaike information criterion, 37 the RCSIC can reduce the risk of choosing an overfitted model, compared with the traditional Akaike information criterion, which is known to favor larger models 38 .…”
Section: Model Selection with the WCR
confidence: 99%
“…In the settings considered with constant cluster size (n), the second and third terms of the RCSIC clearly reduce to the model dimension p multiplied by a cluster-size penalty term (n+1) for the model selection criterion. 32 Since the RCSIC puts a larger penalty on the model dimension p than the traditional Akaike information criterion, 37 the RCSIC can reduce the risk of choosing an overfitted model, compared with the traditional Akaike information criterion, which is known to favor larger models. 38 Our simulations below show comparison results where, for clustered survival data, the proposed RCSIC also appears to reduce the risk of selecting an overfitted model relative to the existing Akaike-based model selection method in survival analysis, AIC, 39 regardless of whether cluster size is informative.…”
Section: Parameter Estimation with Informative Cluster Size
confidence: 99%
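The penalty comparison described in the statements above can be illustrated with simple arithmetic. This is a hedged sketch, not the authors' implementation: it assumes the simplified constant-cluster-size reading of the excerpt, where the RCSIC penalty is the model dimension p times (n+1) while the AIC penalty is 2p; the function names `aic_penalty` and `rcsic_penalty` are illustrative.

```python
def aic_penalty(p: int) -> int:
    """Traditional AIC penalty on a model with p parameters."""
    return 2 * p


def rcsic_penalty(p: int, n: int) -> int:
    """RCSIC-style penalty under constant cluster size n, as described
    in the citing text: model dimension p times (n + 1)."""
    return p * (n + 1)


# For any cluster size n > 1, the RCSIC penalizes model dimension more
# heavily than AIC, which is the mechanism the citing papers credit for
# its lower risk of selecting an overfitted model.
p = 4
for n in (2, 5, 10):
    print(f"n={n}: AIC penalty={aic_penalty(p)}, RCSIC penalty={rcsic_penalty(p, n)}")
```

For p = 4 and n = 2 the RCSIC penalty is already 12 versus AIC's 8, and the gap widens linearly in the cluster size.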
“…The influence of clustering and informative cluster size on inference is well documented, and methods exist for valid inference on marginal parameters in a variety of settings (Benhin et al., 2005; Hoffman et al., 2001; Huang & Leroux, 2011; Pavlou et al., 2013; Seaman et al., 2014; Shen & Chen, 2018; Williamson et al., 2003, 2008). By contrast, the impact of clustering and informative cluster size in prediction modeling has not been closely examined.…”
Section: Introduction
confidence: 99%