2010
DOI: 10.1016/j.patcog.2009.08.006

Parsimonious reduction of Gaussian mixture models with a variational-Bayes approach

Abstract: Aggregating statistical representations of classes is an important task for current trends in scaling up learning and recognition, or for addressing them in distributed infrastructures. In this perspective, we address the problem of merging probabilistic Gaussian mixture models in an efficient way, through the search for a suitable combination of components from the mixtures to be merged. We propose a new Bayesian modelling of this combination problem, in association with a variational estima…
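The merging problem sketched in the abstract can be made concrete with the standard moment-preserving merge of Gaussian components that component-combination schemes build on. The sketch below is a generic illustration, not the paper's variational-Bayes criterion; the function names and the pooling weight `alpha` are ours.

```python
import numpy as np

def pool_mixtures(w1, mu1, cov1, w2, mu2, cov2, alpha=0.5):
    """Concatenate two Gaussian mixtures into one; `alpha` is the share
    of probability mass assigned to the first mixture (illustrative)."""
    w = np.concatenate([alpha * w1, (1.0 - alpha) * w2])
    mu = np.concatenate([mu1, mu2], axis=0)      # shape (K1+K2, d)
    cov = np.concatenate([cov1, cov2], axis=0)   # shape (K1+K2, d, d)
    return w, mu, cov

def merge_pair(wi, mi, Si, wj, mj, Sj):
    """Moment-preserving merge of two components: the merged component
    matches the weight, mean, and covariance of the pair it replaces."""
    w = wi + wj
    m = (wi * mi + wj * mj) / w
    di, dj = (mi - m)[:, None], (mj - m)[:, None]
    S = (wi * (Si + di @ di.T) + wj * (Sj + dj @ dj.T)) / w
    return w, m, S
```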

Cited by 42 publications (18 citation statements)
References 21 publications (29 reference statements)
“…This figure shows an example of PSS application on node 1. Its neighbor list is initially set to [2,3,4,5,6,10]. First, this node randomly selects the node with which to exchange half of its neighbor list.…”
Section: Methods
confidence: 99%
“…Then, both nodes update their neighbor lists, keeping only the new nodes. Figure (b) shows that the list of node 1 is set to [2,3,5,7,9,10] while the list of node 6 is set to [1,4,5,8,10,11]. The dashed arrows represent the changes in the topology of the network.…”
Section: Methods
confidence: 99%
“…In addition to the population-based stochastic search techniques, alternative approaches to the basic EM algorithm also include methods for reducing the complexity of a GMM by trying to estimate the number of components [17,18] or by forcing a hierarchical structure [19,20]. This paper focuses on the conventional problem with a fixed number of components in the mixture.…”
Section: Related Work
confidence: 99%
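As a generic illustration of "estimating the number of components" (the actual criteria used in refs [17,18] may differ), a minimal sketch that selects the component count by the Bayesian information criterion with scikit-learn:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_k_by_bic(X, k_max=10, seed=0):
    """Fit GMMs with k = 1..k_max via EM and keep the k that minimizes
    the BIC (illustrative model-selection criterion)."""
    models = [GaussianMixture(n_components=k, random_state=seed).fit(X)
              for k in range(1, k_max + 1)]
    bics = [m.bic(X) for m in models]
    return models[int(np.argmin(bics))]

# Example: data from three well-separated clusters should recover k = 3.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(200, 2)) for c in (0.0, 3.0, 6.0)])
print(select_k_by_bic(X).n_components)
```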
“…The initial mixture weights were used only in the EM procedure, as the proposed algorithm does not include the weights as parameters. After initialization, the search procedure constrained the components of the mean vectors in each particle defined in (20) to stay in the data region defined by the minimum and maximum values of each component in the data used for estimation. Similarly, the eigenvalues were constrained to stay in [λ_min, λ_max], where λ_min = 10⁻⁵ and λ_max was the maximum eigenvalue of the covariance matrix of the whole data, and the Givens rotation angles were constrained to lie in [−π/4, 3π/4].…”
Section: Experiments on Synthetic Data
confidence: 99%
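A minimal sketch of the constraints described in the quote, under our own naming (the cited work parameterizes covariances by eigenvalues and Givens rotation angles; the clipping below is one plausible realization, not the authors' code):

```python
import numpy as np

def constrain_particle(means, eigvals, angles, X, lam_min=1e-5):
    """Clip a particle's parameters to the feasible region described in
    the quote: means to the per-dimension data range, eigenvalues to
    [lam_min, lam_max] with lam_max the largest eigenvalue of the data
    covariance, and Givens angles to [-pi/4, 3*pi/4]."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    lam_max = np.linalg.eigvalsh(np.cov(X, rowvar=False)).max()
    means = np.clip(means, lo, hi)              # stay inside the data region
    eigvals = np.clip(eigvals, lam_min, lam_max)
    angles = np.clip(angles, -np.pi / 4, 3 * np.pi / 4)
    return means, eigvals, angles
```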