Markov Chain Sampling Methods for Dirichlet Process Mixture Models (2000)
DOI: 10.1080/10618600.2000.10474879
Cited by 1,493 publications (1,362 citation statements)
References 15 publications
“…A common control design for cooperative spectrum prediction in a multi-primary-user environment has yet to be developed in the spectrum prediction literature. Analysis of cooperative prediction using hierarchical Dirichlet processes is an interesting proposal for modeling cooperative spectrum prediction that has not yet been explored in the SOP literature [40].…”
Section: Cooperation and Contention
confidence: 99%
“…Moreover, mixture models are more complex but allow empirical-measurement-based source estimation. An example is the Dirichlet process mixture, which is often used to generate mixture prior distributions, but tracing convergence bounds becomes increasingly difficult, for instance for non-Gaussian mixtures [20,28,40]. Convergence bounds have been calculated only for a limited set of Bayesian mixture class/prior distribution pairs (for example, uniform prior/Epanechnikov kernel) [107].…”
Section: Validity and Complexity
confidence: 99%
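The Dirichlet process prior mentioned in the excerpt above can be simulated directly through its sequential (Chinese restaurant process) representation. A minimal sketch, assuming a single concentration parameter `alpha` (function and parameter names here are illustrative, not taken from the cited works):

```python
import random

def crp_sample(n, alpha, seed=0):
    """Draw one partition of n items from the Chinese restaurant process,
    the sequential representation of the Dirichlet process clustering prior."""
    rng = random.Random(seed)
    counts = []        # counts[k] = number of items at table (cluster) k
    assignment = []
    for i in range(n):
        # item i joins table k with prob counts[k]/(i + alpha),
        # or opens a new table with prob alpha/(i + alpha)
        r = rng.random() * (i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                assignment.append(k)
                counts[k] += 1
                break
        else:
            assignment.append(len(counts))
            counts.append(1)
    return assignment

print(crp_sample(10, alpha=1.0))
```

Larger `alpha` yields more clusters on average; the expected number of clusters grows roughly as `alpha * log(n)`.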
“…The goal state is an assignment that yields an acceptable clustering under a clustering criterion such as mean squared error. A variety of inference methods based on Gibbs sampling (a Markov chain Monte Carlo algorithm) have been proposed for DPMM inference [23]. The collapsed Gibbs sampling algorithm (Algorithm 3 in [23]) is suitable for our use of the DPMM for clustering, as we are interested only in the cluster assignments (the z_i's) and not the actual cluster parameters (the θ_k's).…”
Section: (D) Clustering
confidence: 99%
“…The collapsed Gibbs sampler is an iterative algorithm that, in each iteration, updates the value of z_i for each data point one at a time.…”
Section: (D) Clustering
confidence: 99%
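Algorithm 3 of [23] integrates out the cluster parameters θ_k and resamples each z_i from its conditional given all other assignments: proportional to the cluster's current size times the collapsed predictive density for existing clusters, and to the concentration parameter α times the prior predictive for a new cluster. A minimal sketch of this update, assuming a 1-D Gaussian likelihood with known variance and a conjugate normal prior on cluster means (all names and parameter values below are illustrative):

```python
import math
import random

def normal_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def predictive(x, n_k, sum_k, mu0=0.0, tau2=1.0, sigma2=0.1):
    """Collapsed predictive density of x under a cluster with n_k members:
    Gaussian likelihood (known variance sigma2), conjugate normal prior on the mean."""
    prec = 1.0 / tau2 + n_k / sigma2
    post_var = 1.0 / prec
    post_mean = post_var * (mu0 / tau2 + sum_k / sigma2)
    return normal_pdf(x, post_mean, post_var + sigma2)

def collapsed_gibbs(data, alpha=1.0, iters=100, seed=0):
    rng = random.Random(seed)
    z = [0] * len(data)                 # start with everyone in one cluster
    counts = {0: len(data)}
    sums = {0: sum(data)}
    for _ in range(iters):
        for i, x in enumerate(data):
            # remove point i from its current cluster
            k_old = z[i]
            counts[k_old] -= 1
            sums[k_old] -= x
            if counts[k_old] == 0:      # drop the emptied cluster
                del counts[k_old], sums[k_old]
            labels = list(counts)
            # weight for each existing cluster: size * collapsed predictive
            w = [counts[k] * predictive(x, counts[k], sums[k]) for k in labels]
            # weight for a brand-new cluster: alpha * prior predictive
            labels.append(max(counts, default=-1) + 1)
            w.append(alpha * predictive(x, 0, 0.0))
            # sample the new assignment z_i from the normalized weights
            r = rng.random() * sum(w)
            acc = 0.0
            for k, wk in zip(labels, w):
                acc += wk
                if r <= acc:
                    z[i] = k
                    break
            counts[z[i]] = counts.get(z[i], 0) + 1
            sums[z[i]] = sums.get(z[i], 0.0) + x
    return z

data = [0.0, 0.1, -0.05, 5.0, 5.1, 4.9]
z = collapsed_gibbs(data)
print(z)  # two well-separated groups typically map to two clusters
```

Because the θ_k's are marginalized out, only the assignments z_i are sampled, which matches the use case described in the excerpt above.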