1989
DOI: 10.2307/2982840
Mixture Models: Inference and Applications to Clustering.

Abstract: 17. Mixture Models: Inference and Applications to Clustering (Statistics: Textbooks and Monographs Series, Vol. 84). By G. J. McLachlan and K. E. Basford. Dekker, New York, 1988. xii + 254 pp. $83.50.

Cited by 18 publications (11 citation statements); references 0 publications.
“…The GMM is a probabilistic model that assumes data are generated from a finite set of Gaussian distributions. Gaussian mixture probability density is the weighted sum of k component Gaussian densities [ 30 ]. The GMM can address correlation between attributes by selecting the optimal covariance matrix for each cluster and has been used in previous behavioral clustering problems [ 31 ].…”
Section: Methodsmentioning
confidence: 99%
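The mixture density described above can be sketched directly: the GMM density is the weighted sum of k component Gaussian densities. This is an illustrative snippet, not code from the cited study; the function name and the two-component example are my own.

```python
# Sketch of a Gaussian mixture density as a weighted sum of k component
# Gaussian densities (illustrative; not from the cited paper).
from scipy.stats import multivariate_normal

def gmm_density(x, weights, means, covs):
    # p(x) = sum_j w_j * N(x | mu_j, Sigma_j), with the weights summing to 1
    return sum(w * multivariate_normal.pdf(x, mean=m, cov=c)
               for w, m, c in zip(weights, means, covs))

# Two 1-D components, N(-1, 1) and N(1, 1), with equal weight
p = gmm_density(0.0, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```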
“…The GMM is a probabilistic model that assumes data is generated from a finite set of Gaussian distributions. A Gaussian mixture probability density is the weighted sum of k component Gaussian densities [26]. In this study, we used the GMM implementation from the scikit-learn package in Python to obtain a clustering model for the mobile sensing data [27].…”
Section: Gaussian Mixture Modelmentioning
confidence: 99%
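A minimal sketch of the scikit-learn usage the citing study describes. The data here is synthetic (the real study used mobile sensing data), and the parameter choices are illustrative assumptions.

```python
# Fitting a GMM clustering model with scikit-learn's GaussianMixture
# (synthetic two-cluster data; parameters are illustrative).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3.0, 1.0, size=(100, 2)),
               rng.normal( 3.0, 1.0, size=(100, 2))])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(X)  # hard cluster assignment for each sample
```

`covariance_type="full"` lets each component learn its own full covariance matrix, which is how a GMM can model correlation between attributes.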
“…However, as in k-means, one needs to specify the number of clusters/components a priori in GMM. The optimal value can be determined by minimizing the Bayesian Information Criterion (BIC) [48], a metric that accounts for both the covariance type and the number of components. The BIC penalizes added parameters, offsetting the likelihood gains that come merely from increasing model complexity.…”
Section: Stage Ii: Clustering Featurized Structuresmentioning
confidence: 99%
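The BIC-based selection described above can be sketched as follows: fit a GMM for each candidate number of components and keep the one minimizing BIC. The data and the range of k are my own illustrative choices; the cited work also sweeps covariance types in the same search.

```python
# Sketch of choosing the number of GMM components by minimizing BIC
# (synthetic data; the search range is an illustrative assumption).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-4.0, 1.0, size=(150, 1)),
               rng.normal( 4.0, 1.0, size=(150, 1))])

# BIC trades off fit against parameter count; lower is better.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 6)}
best_k = min(bics, key=bics.get)
```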