2018
DOI: 10.1016/j.csda.2018.05.020
Mode jumping MCMC for Bayesian variable selection in GLMM

Abstract: Generalized linear mixed models (GLMM) are used for inference and prediction in a wide range of different applications providing a powerful scientific tool. An increasing number of sources of data are becoming available, introducing a variety of candidate explanatory variables for these models. Selection of an optimal combination of variables is thus becoming crucial. In a Bayesian setting, the posterior distribution of the models, based on the observed data, can be viewed as a relevant measure for the model e…


Cited by 21 publications (43 citation statements)
References 61 publications
“…In case of different groups of covariates being relevant, it is necessary for the sampler to visit the different modes corresponding to each of these groups in order for the proposed approach to be able to correctly identify the different groups in the postprocessing step. Alternative sampling approaches for BMA with a better mixing behavior, such as those proposed in Clyde et al (2011) and Hubin and Storvik (2018), might thus be preferable to be used in case of groups of relevant covariates being suspected. The impact of their use should be investigated in future work.…”
Section: Discussion
confidence: 99%
“…The posterior model weights for the visited models may then only be determined based on the marginal likelihood and the prior probabilities of the specified model prior, while the empirical frequencies of the visited models are not suitable measures. Further alternative sampling schemes with better mixing have also been proposed, see, for example, Clyde et al (2011) and Hubin and Storvik (2018). In the following empirical analysis, only the MC3 algorithm is employed.…”
Section: The DPC Model Prior
confidence: 99%
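The MC3 algorithm mentioned in the statement above (Markov chain Monte Carlo model composition) can be illustrated with a minimal sketch: a chain over inclusion indicators that flips one covariate at a time and accepts by a Metropolis ratio of marginal likelihoods. This is not the authors' implementation; the g-prior marginal likelihood below is a simplified illustrative choice (constants shared across models are dropped), and all names and defaults are assumptions.

```python
import numpy as np

def log_marginal(X, y, gamma, g=100.0):
    """Approximate log marginal likelihood of a linear model under a
    Zellner-style g-prior, up to an additive constant shared by all models."""
    n = len(y)
    idx = np.flatnonzero(gamma)
    p = len(idx)
    if p == 0:
        return -0.5 * n * np.log(y @ y)  # null model: residual sum = y'y
    Xg = X[:, idx]
    beta = np.linalg.lstsq(Xg, y, rcond=None)[0]  # least-squares fit
    # shrunken residual sum of squares under the g-prior
    rss = y @ y - (g / (1.0 + g)) * (y @ Xg @ beta)
    return -0.5 * p * np.log(1.0 + g) - 0.5 * n * np.log(rss)

def mc3(X, y, n_iter=3000, seed=0):
    """MC3 sketch: propose flipping one inclusion indicator per iteration,
    accept with the Metropolis ratio (the single-flip proposal is symmetric).
    Returns estimated posterior inclusion probabilities."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, dtype=int)          # start from the empty model
    cur = log_marginal(X, y, gamma)
    counts = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)
        prop = gamma.copy()
        prop[j] ^= 1                        # flip covariate j in or out
        new = log_marginal(X, y, prop)
        if np.log(rng.random()) < new - cur:
            gamma, cur = prop, new
        counts += gamma
    return counts / n_iter
```

As the quoted passage notes, with samplers of this kind the empirical visit frequencies are a valid (if slowly mixing) estimate; renormalizing exact marginal likelihoods over the set of visited models is the alternative weighting it describes.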
“…The success of GMJMCMC relies upon resolving the local extrema issue, which is mainly achieved by combining the following two ideas. First, when iterating through a fixed search space S, GMJMCMC utilizes the MJMCMC algorithm (Hubin and Storvik, 2016a) which was specifically constructed to explore multi-modal regression spaces efficiently. Second, the evolution of the search spaces is governed within the framework of a genetic algorithm where a population consists of a finite number of trees forming the current search space.…”
Section: Discussion
confidence: 99%
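The mode jumping idea credited above to MJMCMC (a large jump followed by local optimization, with a matching reverse path so the Metropolis-Hastings acceptance stays valid) can be sketched in one dimension, in the spirit of the Tjelmeland-Hegstad proposals it builds on. This is an illustrative continuous-space analogue, not the authors' algorithm for model spaces; the bimodal target, jump length, and step sizes are all assumptions.

```python
import numpy as np

def log_target(x):
    """Bimodal target: equal mixture of N(-5, 1) and N(5, 1), up to a constant."""
    return np.logaddexp(-0.5 * (x + 5.0) ** 2, -0.5 * (x - 5.0) ** 2)

def local_ascent(x, steps=25, lr=0.5):
    """Deterministic local optimization: gradient ascent with a
    finite-difference gradient, climbing to the nearest mode."""
    for _ in range(steps):
        g = (log_target(x + 1e-4) - log_target(x - 1e-4)) / 2e-4
        x = x + lr * g
    return x

def mode_jump_step(x, rng, jump=10.0, sigma=1.0):
    """One mode jumping MH step: large jump, climb to a mode, propose
    nearby; the reverse move uses the opposite jump so the acceptance
    ratio can be computed exactly."""
    s = rng.choice([-1.0, 1.0])            # jump direction, prob 1/2 each
    mu_f = local_ascent(x + s * jump)      # forward: jump then climb
    y = rng.normal(mu_f, sigma)            # propose near the found mode
    mu_b = local_ascent(y - s * jump)      # reverse path with the opposite jump
    log_q_f = -0.5 * ((y - mu_f) / sigma) ** 2
    log_q_b = -0.5 * ((x - mu_b) / sigma) ** 2
    log_a = log_target(y) + log_q_b - log_target(x) - log_q_f
    return y if np.log(rng.random()) < log_a else x
```

A plain random-walk sampler with small steps would essentially never cross the region of negligible density between the two modes; the paired forward/backward optimization is what keeps the jump proposal reversible.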
“…Clearly the genetic algorithm used to update search spaces results in a Markov chain of model spaces. In the future it will be interesting to generalize the mode jumping ideas from Hubin and Storvik (2016a) to the Markov chain of search spaces, making it converge to the right limiting distribution in the joint space of models, parameters and search spaces, whilst remaining the property of not getting stuck in local modes.…”
Section: Discussion
confidence: 99%