A Marginal Sampler for σ-Stable Poisson–Kingman Mixture Models (2017)
DOI: 10.1080/10618600.2015.1110526

Abstract: We investigate the class of σ-stable Poisson–Kingman random probability measures (RPMs) in the context of Bayesian nonparametric mixture modeling. This is a large class of discrete RPMs which encompasses most of the popular discrete RPMs used in Bayesian nonparametrics, such as the Dirichlet process, the Pitman–Yor process, the normalized inverse Gaussian process and the normalized generalized Gamma process. We show how certain sampling proper…


Cited by 8 publications (7 citation statements)
References 54 publications
“…Even for small N , this number is very large, which makes computation of the posterior intractable for the simplest choice of prior and likelihood. Thus, MCMC techniques are typically employed, such as the marginal samplers described by Neal [2000] with extensions in Favaro and Teh [2013] for normalized completely random measures and in Lomelí et al [2016] for σ-stable Poisson–Kingman models; the conditional samplers described in Ishwaran and James [2001], Papaspiliopoulos and Roberts [2008], or Kalli et al [2011], with extensions in Favaro and Teh [2013] for normalized completely random measures and in Favaro and Walker [2012] for σ-stable Poisson–Kingman models; or the recently introduced class of hybrid samplers for σ-stable Poisson–Kingman models in Lomelí et al [2015]. These algorithms produce approximate samples (c_m)_{m=1}^{M} from the posterior (1).…”
Section: Bayesian Nonparametric Clustering
confidence: 99%
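The marginal samplers mentioned in the excerpt above work by integrating out the random measure and resampling one cluster label at a time from its Chinese-restaurant full conditional. A minimal sketch in the spirit of Neal [2000] (Algorithm 3), for the simplest special case — a Dirichlet process mixture of univariate Gaussians with a conjugate normal prior on the means and a fixed component variance — is given below; the model choices, function names, and hyperparameter values are illustrative assumptions, not the σ-stable Poisson–Kingman construction of the paper itself.

```python
import numpy as np

def _norm_logpdf(y, mean, var):
    # log density of N(mean, var) evaluated at y
    return -0.5 * (np.log(2 * np.pi * var) + (y - mean) ** 2 / var)

def crp_gibbs(x, alpha=1.0, sigma2=1.0, tau2=4.0, n_iter=100, seed=0):
    """Collapsed (marginal) Gibbs sampler for a DP mixture of Gaussians.

    Component means have a conjugate N(0, tau2) prior and are integrated
    out; the component variance sigma2 is fixed.  Each sweep resamples one
    cluster label at a time from its CRP full conditional.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    c = np.zeros(n, dtype=int)  # start with everything in one cluster
    for _ in range(n_iter):
        for i in range(n):
            # data and labels with point i held out
            x_i, c_i = np.delete(x, i), np.delete(c, i)
            labels, counts = np.unique(c_i, return_counts=True)
            logp = []
            for lab, cnt in zip(labels, counts):
                xk = x_i[c_i == lab]
                # posterior predictive of an existing cluster: N(m, v + sigma2)
                v = 1.0 / (1.0 / tau2 + len(xk) / sigma2)
                m = v * xk.sum() / sigma2
                logp.append(np.log(cnt) + _norm_logpdf(x[i], m, v + sigma2))
            # new cluster: prior predictive N(0, tau2 + sigma2), weight alpha
            logp.append(np.log(alpha) + _norm_logpdf(x[i], 0.0, tau2 + sigma2))
            logp = np.array(logp)
            p = np.exp(logp - logp.max())
            k = rng.choice(len(p), p=p / p.sum())
            c[i] = labels[k] if k < len(labels) else (labels.max() + 1 if len(labels) else 0)
        _, c = np.unique(c, return_inverse=True)  # relabel to 0..K-1
    return c
```

Each call returns one (approximate) posterior sample of the label vector; collecting the labels across sweeps gives the samples (c_m) referred to in the excerpt. The σ-stable Poisson–Kingman extensions replace the simple CRP predictive weights with the corresponding exchangeable-partition predictive rule.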
“…Although the PD(α, θ) class of models dominates the broad literature, there has been significant interest in the general class of Gibbs partitions. Here we note a few examples in [5,18,23,35,38,39,55,79]. Our exposition takes another viewpoint of this general class as we begin to describe next.…”
Section: Preliminaries on Poisson–Kingman Distributions and Gibbs Par…
confidence: 99%
“…Gibbs-type species sampling models have been also applied in the context of mixture modeling, thus generalizing the seminal work by Lo [47]. See, e.g., Ishwaran and James [34], Lijoi et al [41], Lijoi et al [42], Favaro and Walker [23] and Lomeli et al [49]. While maintaining the same computational tractability of the Dirichlet process mixture model, the availability of the additional parameter α allows for a better control of the clustering behaviour.…”
Section: A Brief Review of Gibbs-Type Priors
confidence: 99%
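The "better control of the clustering behaviour" afforded by the additional parameter α in the excerpt above can be seen directly by simulating partitions: in the Pitman–Yor Chinese restaurant process, a discount α > 0 makes the number of clusters grow like a power of the sample size rather than logarithmically as for the Dirichlet process. A generative sketch, with parameter values chosen purely for illustration:

```python
import numpy as np

def pitman_yor_crp(n, alpha, theta, rng):
    """Sample a partition of n items from the Pitman-Yor Chinese
    restaurant process with discount alpha in [0, 1) and strength
    theta > -alpha; returns the list of cluster sizes."""
    counts = []
    for i in range(n):
        # existing cluster k gets weight (n_k - alpha);
        # a new cluster gets weight (theta + alpha * K)
        w = np.array([c - alpha for c in counts] + [theta + alpha * len(counts)])
        k = rng.choice(len(w), p=w / w.sum())  # normalizer is i + theta
        if k < len(counts):
            counts[k] += 1
        else:
            counts.append(1)
    return counts

# alpha = 0 recovers the Dirichlet process; alpha = 0.5 yields
# many more (power-law) clusters at the same strength theta.
dp = pitman_yor_crp(2000, 0.0, 1.0, np.random.default_rng(0))
py = pitman_yor_crp(2000, 0.5, 1.0, np.random.default_rng(0))
```

For α = 0 the expected number of clusters is roughly θ log n, while for α > 0 it grows like n^α, which is what gives Gibbs-type priors their extra flexibility over the Dirichlet process while retaining a tractable predictive rule.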