2012
DOI: 10.1109/tpami.2011.199
Model-Based Learning Using a Mixture of Mixtures of Gaussian and Uniform Distributions

Abstract: We introduce a mixture model whereby each mixture component is itself a mixture of a multivariate Gaussian distribution and a multivariate uniform distribution. Although this model could be used for model-based clustering (model-based unsupervised learning) or model-based classification (model-based semi-supervised learning), we focus on the more general model-based classification framework. In this setting, we fit our mixture models to data where some of the observations have known group memberships and the g…
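The abstract's core idea, a component density that is itself a convex combination of a multivariate Gaussian and a multivariate uniform, can be sketched as below. This is an illustrative reading only: the axis-aligned box support for the uniform part and all function names are assumptions, not the paper's exact parameterization.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Multivariate Gaussian density N(x | mu, sigma)."""
    d = len(mu)
    diff = x - mu
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(sigma))
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(sigma) @ diff)

def component_density(x, mu, sigma, lo, hi, alpha):
    """One mixture component: alpha * Gaussian + (1 - alpha) * uniform.
    The uniform part lives on the axis-aligned box [lo, hi]; this
    support is an illustrative choice, not the paper's specification."""
    in_box = np.all((x >= lo) & (x <= hi))
    unif = 1.0 / np.prod(hi - lo) if in_box else 0.0
    return alpha * gaussian_pdf(x, mu, sigma) + (1.0 - alpha) * unif

def mixture_density(x, weights, params):
    """Full model: a weighted sum of Gaussian-uniform components."""
    return sum(w * component_density(x, *p) for w, p in zip(weights, params))
```

Setting alpha to 1 recovers an ordinary Gaussian mixture component, while alpha below 1 adds a flat "burst" of probability over the uniform's support.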

Cited by 67 publications (45 citation statements)
References 23 publications
“…This also includes De Angelis et al [16], who offered a robust time-interval measurement method based on a Gaussian-uniform mixture model. Browne et al [17] incorporated a mixture of multivariate Gaussian and uniform distributions as the component density, which allowed for superior mixture processing. These methods demonstrate competitive performance in fitting different shapes of observed data.…”
Section: (1) Schemes Based on Markov Random Field (MRF)
confidence: 99%
“…We used a GMM (Gaussian Mixture Model) [46][51][4] to define the visual code words from the descriptor vectors; a GMM is a parametric probability density function represented as a weighted sum of (in our case 256) Gaussian component densities, as can be seen in Equation 1.…”
Section: Image Representation
confidence: 99%
“…Hennig (2004) suggests adding an improper uniform distribution as an additional mixture component. Browne, McNicholas, and Sparling (2012) also make use of uniform distributions, but they do so by making each component a mixture of a Gaussian and a uniform distribution. Rather than specifically accommodating bad points, this approach allows for what they call "bursts" of probability as well as locally heavier tails; this might have the effect of dealing with bad points for some data sets.…”
Section: Robust Clustering
confidence: 99%
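The robustness point made in the last citation statement, that the uniform part can absorb "bad points", can be illustrated within a single component: the posterior probability that an observation came from the Gaussian part shrinks as the observation moves into the tails. This is a 1-D sketch under assumed names and parameters, not the authors' estimation procedure.

```python
import numpy as np

def gaussian_part_weight(x, mu, var, lo, hi, alpha):
    """Within one Gaussian-uniform component, the posterior probability
    that scalar observation x was generated by the Gaussian part rather
    than the uniform part on [lo, hi]. All names are illustrative."""
    gauss = np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    unif = (1.0 / (hi - lo)) if lo <= x <= hi else 0.0
    num = alpha * gauss
    return num / (num + (1.0 - alpha) * unif)
```

An inlier near the Gaussian mean is attributed almost entirely to the Gaussian part, while a point far in the tails is mostly attributed to the uniform part, which is what lets the uniform component soak up outliers.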