2017
DOI: 10.1080/01621459.2016.1255636

Mixture Models With a Prior on the Number of Components

Abstract: A natural Bayesian approach for mixture models with an unknown number of components is to take the usual finite mixture model with symmetric Dirichlet weights, and put a prior on the number of components—that is, to use a mixture of finite mixtures (MFM). The most commonly-used method of inference for MFMs is reversible jump Markov chain Monte Carlo, but it can be nontrivial to design good reversible jump moves, especially in high-dimensional spaces. Meanwhile, there are samplers for Dirichlet process mixture …
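
For orientation, below is a minimal sketch of the kind of generative model the abstract describes: a prior on the number of components, symmetric Dirichlet weights, and component parameters drawn from a base measure. The shifted-Poisson prior on the number of components, the Gaussian component family, and all numerical settings are illustrative assumptions, not necessarily the paper's own choices.

```python
# Sketch of an MFM generative model as described in the abstract.
# The Poisson(1)+1 prior on k, Gaussian components, and parameter values
# are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def sample_mfm(n, gamma=1.0, mu0=0.0, sigma0=5.0, sigma=1.0):
    """Draw n observations from a univariate Gaussian mixture of finite mixtures."""
    k = 1 + rng.poisson(1.0)                    # number of components ~ assumed prior p(k)
    weights = rng.dirichlet(np.full(k, gamma))  # symmetric Dirichlet(gamma, ..., gamma) weights
    means = rng.normal(mu0, sigma0, size=k)     # component parameters from a base measure
    z = rng.choice(k, size=n, p=weights)        # latent component assignments
    x = rng.normal(means[z], sigma)             # observations given assignments
    return x, z, k

x, z, k = sample_mfm(200)
print(f"drew {len(x)} observations from k = {k} components")
```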

Cited by 187 publications (304 citation statements)
References 98 publications
“…A well-known and widely used method is reversible jump Markov chain Monte Carlo (Green, 1995), which, due to its nontrivial setup, has led to the search for alternatives. A recent and interesting one is proposed by Miller and Harrison (2018), where the model in (1) is written as…”
Section: Introduction (mentioning, confidence: 99%)
“…Throughout the paper we will adopt, for the weights and the component parameters, the priors proposed in Miller and Harrison (2018); this will not affect the analysis of the results and the comparisons among different priors for k.…”
Section: Introduction (mentioning, confidence: 99%)
“…Clustering models partition entities into mutually exclusive latent groups (clusters). Numerous methods have been developed, including algorithm-based approaches such as k-means, and model-based clustering methods such as finite mixture models (Richardson & Green, 1997; Miller & Harrison, 2018) and infinite mixture models (Lau & Green, 2007; Favaro & Teh, 2013; Barrios et al., 2013, for example). Partitioning symptoms, similar to graphical models, may discover latent diseases that are related to subsets of symptoms, whereas clustering patients suggests latent diseases that are shared among groups of patients.…”
Section: Introduction (mentioning, confidence: 99%)
“…We can either add another hierarchy for Bayesian dewarping or develop regression models for {λ*} to incorporate disease phenotype information or other covariates, e.g., age and gender, to refine disease subsetting. Second, multiple autoantibodies produced against a particular Z, or extracted continuous intensity shape information for each landmark and lane, either by regularization or using shrinkage priors in a Bayesian framework for encouraging few and maximally different complexes (e.g., Broderick and others, 2013; Miller and Harrison, 2015). Our preliminary results (not shown here) show good subset and signature estimation performance.…”
Section: Discussion (mentioning, confidence: 66%)