2014
DOI: 10.1002/sim.6322

Ensemble learning of inverse probability weights for marginal structural modeling in large observational datasets

Abstract: Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized w…
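As background for the logistic-regression baseline the abstract refers to, here is a minimal sketch of stabilized inverse probability weight estimation. All data and variable names (`V`, `L`, `A`) are hypothetical, and scikit-learn stands in for whatever software the authors used; this is an illustration of the standard point-treatment approach, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: baseline covariate V, time-varying covariate L,
# binary treatment A whose probability depends on both.
rng = np.random.default_rng(0)
n = 1000
V = rng.normal(size=n)
L = rng.normal(size=n) + 0.5 * V
A = rng.binomial(1, 1 / (1 + np.exp(-(0.3 * V + 0.8 * L))))

# Denominator model: P(A = 1 | V, L), the full treatment model.
denom = LogisticRegression().fit(np.column_stack([V, L]), A)
p_denom = denom.predict_proba(np.column_stack([V, L]))[:, 1]

# Numerator model: P(A = 1 | V), baseline covariates only;
# this ratio form is what stabilizes the weights.
num = LogisticRegression().fit(V.reshape(-1, 1), A)
p_num = num.predict_proba(V.reshape(-1, 1))[:, 1]

# Stabilized weight: ratio of the two treatment probabilities,
# evaluated at the treatment actually received.
sw = np.where(A == 1, p_num / p_denom, (1 - p_num) / (1 - p_denom))
print(sw.mean())  # stabilized weights should average close to 1
```

A mean far from 1 is a common diagnostic that one of the treatment models is misspecified, which is precisely the weakness the paper's ensemble approach targets.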

Cited by 58 publications (50 citation statements)
References 27 publications
“…The use of optimal study design and statistical approach are critical elements in population‐based studies, like the present work. In order to control for time‐varying confounders when studying time‐varying exposures, marginal structural model (MSM) is considered the preferred method. Given the complex nature of drug exposure measurements in our study, we applied a time‐dependent Cox model.…”
Section: Discussion
confidence: 99%
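To make concrete why MSMs are preferred here, a compact hypothetical sketch: a confounder `L` drives both treatment `A` and outcome `Y`, and the MSM regresses `Y` on `A` alone with inverse probability weights removing the confounding. Variable names and the scikit-learn weighted regression are illustrative assumptions, not the cited study's analysis (which used a time-dependent Cox model).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical point-treatment data: L confounds treatment A and outcome Y.
rng = np.random.default_rng(2)
n = 5000
L = rng.normal(size=n)
A = rng.binomial(1, 1 / (1 + np.exp(-1.5 * L)))
Y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * A + 1.5 * L))))

# Treatment model for the weight denominator (unstabilized for brevity).
ps = LogisticRegression().fit(L.reshape(-1, 1), A).predict_proba(L.reshape(-1, 1))[:, 1]
w = np.where(A == 1, 1 / ps, 1 / (1 - ps))

# MSM: weighted regression of Y on treatment alone; the weighting
# creates a pseudo-population in which L no longer confounds A.
msm = LogisticRegression().fit(A.reshape(-1, 1), Y, sample_weight=w)
print(msm.coef_[0][0])  # estimated marginal log odds ratio for treatment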
“…Other than the aforementioned works that follow the Young et al. framework, there exist a number of studies that have proposed various other algorithms for simulating data suitable for fitting MSCMs.…”
Section: Methods
confidence: 99%
“…Instead, multiple different standard machine-learning algorithms (e.g., logistic regression, random forest, support vector machine, naïve Bayesian classifier, artificial neural network) are used to predict unobserved values [86] and to estimate IPW weights [37]. These diverse predictions are then combined via weighted averaging, where the weights reflect how well each algorithm predicts known values that have been deliberately excluded (held out for test purposes) from the data supplied to the algorithms, the computational statistical technique known as cross-validation.…”
Section: Changes In Causes Make Future Effects Different From What Th…
confidence: 99%
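The cross-validated weighted averaging described in the quote above can be sketched as follows. This is a hypothetical simplification: the learner list, data, and the log-likelihood-based weighting are assumptions (the super learner proper uses a constrained regression to choose the combination weights), so treat it as a shape of the idea rather than the cited method.

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB

# Hypothetical task: predict binary treatment A from covariates X.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
A = rng.binomial(1, 1 / (1 + np.exp(-(X @ np.array([0.5, -0.3, 0.8])))))

learners = [
    LogisticRegression(),
    RandomForestClassifier(n_estimators=50, random_state=0),
    GaussianNB(),
]

# Held-out predictions for each learner: every value is predicted by a
# model that never saw that observation (5-fold cross-validation).
cv_preds = np.column_stack([
    cross_val_predict(m, X, A, cv=5, method="predict_proba")[:, 1]
    for m in learners
])

# Weight each learner by its held-out log-likelihood: a simple stand-in
# for the super learner's constrained regression over learner predictions.
eps = 1e-9
loglik = (A[:, None] * np.log(cv_preds + eps)
          + (1 - A[:, None]) * np.log(1 - cv_preds + eps)).sum(axis=0)
w = np.exp(loglik - loglik.max())
w /= w.sum()

# Ensemble treatment probability, ready to plug into the IPW ratio.
p_ensemble = cv_preds @ w
```

Because the combination weights are judged on held-out data, a flexible learner that overfits (e.g., the random forest here) is penalized automatically, which is the core argument for ensembling over a single logistic regression.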