2020
DOI: 10.48550/arxiv.2004.14758
Preprint
Preventing Posterior Collapse with Levenshtein Variational Autoencoder

Cited by 3 publications (3 citation statements). References 0 publications.
“…We propose using a random schedule for the variable β in order to reduce the potential effects that could be caused by the posterior collapse problem in VAEs (Lucas et al, 2019 ; Havrylov and Titov, 2020 ; Takida et al, 2021 ) and to maintain a balance with the reconstruction loss. We take a sample from a uniform distribution for every example that we go through in the training process.…”
Section: Methods
confidence: 99%
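The random β schedule described in this excerpt can be sketched in a few lines. The function name, the uniform range [0, beta_max], and the list-based interface below are illustrative assumptions, not the cited paper's exact formulation:

```python
import random

def elbo_with_random_beta(recon_losses, kls, beta_max=1.0, rng=random):
    """Average per-example loss with a beta drawn uniformly for each example.

    recon_losses, kls: lists of per-example reconstruction and KL losses.
    Sampling a fresh beta per example (rather than annealing one global beta)
    randomly down-weights the KL term, which can help avoid posterior collapse
    while keeping the objective balanced against the reconstruction loss.
    """
    total = 0.0
    for rec, kl in zip(recon_losses, kls):
        beta = rng.uniform(0.0, beta_max)  # fresh beta ~ U(0, beta_max) per example
        total += rec + beta * kl
    return total / len(recon_losses)
```

Because each β lies in [0, beta_max], the resulting loss is always bounded between the pure reconstruction loss (β = 0) and the full ELBO (β = beta_max).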
“…The posterior collapse effect is a known problem of shallow VAEs when some or even all of the latent variables do not carry any information about the observed data. There are various methods to deal with this issue for VAEs, such as changing the parameterization [Dieng et al., 2019; He et al., 2019], changing the optimization or the objective [Alemi et al., 2018; Bowman et al., 2016; Fu et al., 2019; Havrylov and Titov, 2020; Razavi et al., 2019], or using hierarchical models [Child, 2021; Maaløe et al., 2017; Tomczak and Welling, 2018; Vahdat and Kautz, 2020b]. Here, we focus entirely on hierarchical VAEs, since the posterior collapse problem is not fully analyzed in their context.…”
Section: An Analysis of the Posterior Collapse in Hierarchical VAEs
confidence: 99%
“…Kingma et al. [13] and Pelsmaeker and Aziz [19] proposed introducing constraints that force the divergence term to stay above a hyperparameter. Finally, alternative surrogate objectives for training VAEs in the context of text generation have been proposed [16, 9].…”
Section: Introduction
confidence: 99%
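The constraint mentioned in this excerpt, forcing the divergence term above a hyperparameter, is commonly implemented as a "free bits" floor on the KL. A minimal sketch, assuming a list of per-dimension KL values and a hypothetical threshold `lam` (the function name and interface are illustrative, not taken from the cited works):

```python
def free_bits_kl(kl_per_dim, lam=0.5):
    """Sum per-dimension KL values, clamping each from below at lam.

    kl_per_dim: list of KL divergence values, one per latent dimension.
    Dimensions whose KL is already below lam contribute the constant lam,
    so the objective no longer pushes them toward zero, which discourages
    collapsing those dimensions entirely.
    """
    return sum(max(kl, lam) for kl in kl_per_dim)
```

A dimension with KL 0.1 contributes 0.5 under the default threshold, while one with KL 1.0 contributes its full value.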