2017 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2017.7966329
Generative mixture of networks

Abstract: A generative model based on training deep architectures is proposed. The model consists of K networks that are trained together to learn the underlying distribution of a given data set. The process starts with dividing the input data into K clusters and feeding each of them into a separate network. After a few iterations of training the networks separately, we use an EM-like algorithm to train the networks together and update the clusters of the data. We call this model Mixture of Networks. The provided model is a p…
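The abstract describes an EM-like alternation: partition the data into K clusters, fit one model per cluster, then repeatedly reassign points to the best-scoring model and refit. A minimal sketch of that loop follows, using per-cluster Gaussian density models as hypothetical stand-ins for the paper's networks (the data, split rule, and model family here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabeled data drawn from two Gaussian components.
data = np.concatenate([rng.normal(-3.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

K = 2
# Initial hard clustering (the paper uses K-means; a median split is a simple stand-in).
order = np.argsort(data)
clusters = [data[order[:200]], data[order[200:]]]

# Each "network" here is just a Gaussian density model (mean, std) fit to its cluster.
params = [(c.mean(), c.std()) for c in clusters]

def log_density(x, mu, sigma):
    # Log of the Gaussian density, up to an additive constant.
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

for _ in range(10):
    # E-like step: reassign each point to the model that scores it highest.
    scores = np.stack([log_density(data, mu, sigma) for mu, sigma in params])
    assign = scores.argmax(axis=0)
    # M-like step: refit each model on its updated cluster.
    params = [(data[assign == k].mean(), data[assign == k].std()) for k in range(K)]

means = sorted(mu for mu, _ in params)
print(means)  # recovered component means, close to -3 and 3
```

Replacing the closed-form Gaussian fit with a few gradient steps on each network's loss recovers the structure of the training procedure the abstract outlines.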

Cited by 6 publications (3 citation statements)
References 20 publications
“…The concept of GANs can help address this concern. Generative models aim to estimate the probability distribution of the training data and generate samples that belong to the same data distribution manifold [111]. GAN-based semi-supervised architectures for the regression task, proposed recently [112], strengthen the possibilities of applying GANs to the regression tasks of digital circuits.…”
Section: RTL Level
confidence: 99%
“…N-EM trains the parameters of EM using a neural network, which yields a differentiable clustering model and is used for unsupervised segmentation, where N-EM can cluster constituent objects. Banijamali et al. [2] use a generative mixture of networks (GMN) to simulate the GMM. They first use K-means to obtain prior knowledge of the dataset, and then treat each network as a cluster.…”
Section: Related Work
confidence: 99%
“…We make the following contributions: (1) We are the first to achieve general EM process using GAN by introducing a novel GAN based EM learning framework (GAN-EM) that is able to perform clustering, semi-supervised classification and dimensionality reduction; (2) We conduct thoughtful experiments and show that our GAN-EM achieves state-of-the-art clustering results on MNIST and CelebA datasets, and semi-supervised classification results on SVHN and CelebA.…”
Section: Introduction
confidence: 99%