2016
DOI: 10.1051/eas/1677005
Supervised and Unsupervised Classification Using Mixture Models

Abstract: This chapter is dedicated to model-based supervised and unsupervised classification. Probability distributions are defined over the possible labels as well as over the observations given the labels. To this end, the basic tools are mixture models. This methodology yields a posterior distribution over the labels given the observations, which allows one to quantify the uncertainty of the classification. The role of Gaussian mixture models is emphasized, leading to Linear Discriminant Analysis and Quadratic Discriminan…
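As an illustrative sketch of the approach the abstract describes (not code from the chapter), the posterior distribution over labels can be computed by Bayes' rule from class priors and Gaussian class-conditional densities. With one full covariance matrix per class this corresponds to Quadratic Discriminant Analysis; sharing a single covariance across classes would give Linear Discriminant Analysis. The synthetic data and parameter choices below are assumptions for demonstration only.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Two synthetic classes with different means and covariances
X0 = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=100)
X1 = rng.multivariate_normal([3, 3], [[1.5, -0.4], [-0.4, 0.8]], size=100)
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Maximum-likelihood estimates of per-class priors, means, covariances
priors, means, covs = [], [], []
for k in (0, 1):
    Xk = X[y == k]
    priors.append(len(Xk) / len(X))
    means.append(Xk.mean(axis=0))
    covs.append(np.cov(Xk, rowvar=False))

def posterior(x):
    """Posterior distribution over labels given an observation x."""
    joint = np.array([priors[k] * multivariate_normal.pdf(x, means[k], covs[k])
                      for k in (0, 1)])
    return joint / joint.sum()

# A point near the first class mean: the posterior quantifies how
# certain the classification is, not just which label wins.
p = posterior([0.2, 0.1])
```

The posterior `p` sums to one by construction, and its entropy directly measures classification uncertainty for that observation.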

Cited by 5 publications (1 citation statement). References 28 publications.
“…The choice of the best statistical DLM model and the optimum number of clusters 𝐾 depends on the data and was here estimated with the integrated completed likelihood (ICL) criterion. This criterion penalises the likelihood by the number of parameters of the statistical model, the number of observations, and favours well-separated clusters (Biernacki et al. 2000; Girard & Saracco 2016).…”
Section: The Fisher-EM Algorithm (mentioning; confidence: 99%)
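The ICL criterion quoted above can be sketched as follows: it adds to the BIC penalty an entropy term on the posterior cluster memberships, so that solutions with ambiguous (overlapping) clusters are penalised. This is a hedged illustration using scikit-learn's `GaussianMixture`; the data and the candidate values of K are assumptions, not taken from the cited works.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Two well-separated synthetic clusters
X = np.vstack([rng.normal(0, 1, size=(150, 2)),
               rng.normal(6, 1, size=(150, 2))])

def icl(gm, X):
    """ICL ≈ BIC plus an entropy penalty on posterior memberships (lower is better)."""
    tau = gm.predict_proba(X)  # posterior membership probabilities
    entropy = -np.sum(tau * np.log(np.clip(tau, 1e-12, None)))
    return gm.bic(X) + 2.0 * entropy

scores = {}
for K in (1, 2, 3, 4):
    gm = GaussianMixture(n_components=K, random_state=0).fit(X)
    scores[K] = icl(gm, X)

best_K = min(scores, key=scores.get)
```

Because the entropy term vanishes when clusters are well separated, ICL tends to agree with BIC on clean data but resists splitting one cluster into overlapping components.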