2015
DOI: 10.48550/arxiv.1503.05571
Preprint
GSNs : Generative Stochastic Networks

Cited by 5 publications (6 citation statements) | References 0 publications
“…The use of deep neural networks as generative models for complex data has made great advances in recent years. This success has been achieved through a surprising diversity of training losses and model architectures, including denoising autoencoders (Vincent et al., 2010), variational autoencoders (Kingma & Welling, 2013; Rezende et al., 2014; Gregor et al., 2015; Kulkarni et al., 2015; Burda et al., 2015; Kingma et al., 2016), generative stochastic networks (Alain et al., 2015), diffusion probabilistic models (Sohl-Dickstein et al., 2015), autoregressive models (Theis & Bethge, 2015; van den Oord et al., 2016a), real non-volume preserving transformations (Dinh et al., 2014), Helmholtz machines (Dayan et al., 1995; Bornschein et al., 2015), and Generative Adversarial Networks (GANs) (Goodfellow et al., 2014).…”
Section: Introduction
confidence: 99%
“…Walkback and variational walkback. Contrastive adjustment is closely related to the walkback algorithm in Generative Stochastic Networks (GSNs) [1]. GSNs form a Markov chain by alternating sampling from a corruption process and a denoising distribution.…”
Section: Related Work
confidence: 99%
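The alternating corrupt/denoise chain described in the excerpt above can be sketched as follows. Everything concrete here is an assumption for illustration: the data distribution is taken to be a known 1-D Gaussian so that the denoising distribution P(x | x̃) has a closed form, whereas in an actual GSN that distribution is learned from data.

```python
import math
import random

# Hedged sketch of a GSN-style Markov chain: alternate a Gaussian
# corruption process C(x_tilde | x) with a denoising step P(x | x_tilde).
MU, SIGMA = 0.0, 1.0   # assumed data distribution N(MU, SIGMA^2)
NOISE = 0.5            # corruption noise std

def corrupt(x, rng):
    """C(x_tilde | x): add Gaussian noise to the current sample."""
    return x + rng.gauss(0.0, NOISE)

def denoise(x_tilde, rng):
    """Sample the exact Gaussian posterior P(x | x_tilde) for this toy
    setup; a real GSN would use a trained denoising model here."""
    var_post = 1.0 / (1.0 / SIGMA**2 + 1.0 / NOISE**2)
    mean_post = var_post * (MU / SIGMA**2 + x_tilde / NOISE**2)
    return mean_post + rng.gauss(0.0, math.sqrt(var_post))

def gsn_chain(x0, steps, seed=0):
    """Run the alternating corrupt/denoise Markov chain from x0."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        x_tilde = corrupt(x, rng)
        x = denoise(x_tilde, rng)
        samples.append(x)
    return samples

samples = gsn_chain(x0=5.0, steps=2000)
tail = samples[500:]  # discard burn-in
mean = sum(tail) / len(tail)
var = sum((s - mean) ** 2 for s in tail) / len(tail)
print(round(mean, 2), round(var, 2))
```

Because the corruption and denoising steps here form exact Gibbs sampling on the joint of x and x̃, the chain's stationary marginal over x is the assumed data distribution, which is the property the walkback training procedure exploits.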
“…As a result, maximum reconstruction estimation (Hinton, Osindero, and Teh 2006; Vincent et al. 2008; Bengio 2009; Vincent et al. 2010; Socher et al. 2011; Ammar, Dyer, and Smith 2014; Alain et al. 2015) aims to find a set of parameters that maximize the product of reconstruction probabilities of the training data:…”
Section: Maximum Reconstruction Estimation
confidence: 99%
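The excerpt above is cut off before the equation it introduces. A plausible form, assumed purely from the description "maximize the product of reconstruction probabilities of the training data" (the latent variable $\mathbf{y}$ and its marginalization are assumptions, not taken from the excerpt), would be:

```latex
\hat{\theta} = \operatorname*{arg\,max}_{\theta}
  \prod_{n=1}^{N} P\bigl(\mathbf{x}^{(n)} \mid \mathbf{x}^{(n)}; \theta\bigr),
\qquad
P(\mathbf{x} \mid \mathbf{x}; \theta)
  = \sum_{\mathbf{y}} P(\mathbf{y} \mid \mathbf{x}; \theta)\,
                      P(\mathbf{x} \mid \mathbf{y}; \theta),
```

i.e. each training example $\mathbf{x}^{(n)}$ is encoded into a latent representation and the parameters are chosen to make decoding back to the same example as likely as possible.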
“…In this work, we introduce maximum reconstruction estimation (MRE) (Hinton, Osindero, and Teh 2006; Vincent et al. 2008; Bengio 2009; Vincent et al. 2010; Socher et al. 2011; Ammar, Dyer, and Smith 2014; Alain et al. 2015) for learning generative latent-variable models. The basic idea is to circumvent irrelevant but common correlations by maximizing the probability of reconstructing observed data.…”
Section: Introduction
confidence: 99%