Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022
DOI: 10.18653/v1/2022.naacl-main.34

Towards Robust and Semantically Organised Latent Representations for Unsupervised Text Style Transfer

Abstract: Recent studies show that auto-encoder based approaches successfully perform language generation, smooth sentence interpolation, and style transfer over unseen attributes using unlabelled datasets in a zero-shot manner. The latent space geometry of such models is organised well enough to perform on datasets where the style is "coarse-grained", i.e., a small fraction of the words in a sentence alone is enough to determine the overall style label. A recent study uses a discrete token-based perturbation approach to map…
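To make the notion of a "coarse-grained" style concrete: the abstract means that a handful of marker words largely determines a sentence's style label. A minimal illustrative sketch (not from the paper; the marker lists and function name are hypothetical) of such a word-level style classifier:

```python
# Hypothetical marker-word lists for sentiment-style labels; in "coarse-grained"
# datasets (e.g. Yelp sentiment), a few such tokens decide the overall label.
POSITIVE = {"great", "delicious", "friendly", "amazing"}
NEGATIVE = {"terrible", "bland", "rude", "awful"}

def coarse_style(sentence: str) -> str:
    """Label a sentence by counting style-marker tokens only."""
    tokens = sentence.lower().split()
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(coarse_style("the food was delicious and the staff friendly"))  # positive
print(coarse_style("the service was rude and the soup bland"))        # negative
```

Under this regime, perturbing or swapping those few tokens (as in the discrete token-based perturbation approach the abstract mentions) is enough to flip the style label while leaving the rest of the sentence intact.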

Cited by 3 publications (3 citation statements)
References 17 publications
“…Recent work focuses on various common paradigms such as disentanglement (Hu et al., 2017; Shen et al., 2017), cycle-consistency losses (Yi et al., 2020; Luo et al., 2019; Dai et al., 2019; Liu et al., 2021), and induction (Narasimhan et al., 2022; Shen et al., 2020). Jin et al. (2021) and Hu et al. (2020) provide surveys detailing the current state of style transfer and lay down useful taxonomies to structure the field.…”
Section: Related Work
confidence: 99%
“…the style-transferred output is not available. Past work focuses on various common paradigms such as disentanglement (Hu et al., 2017; Shen et al., 2017), cycle-consistency losses (Yi et al., 2020; Luo et al., 2019; Dai et al., 2019; Liu et al., 2021), induction (Narasimhan et al., 2022; Shen et al., 2020), etc. We focus on a sentence-to-sentence "transduction" (or prototype editing) method, a solution which naturally emerges when following a probabilistic formulation consisting of a single transduction model with a latent prior over a style-absent corpus.…”
Section: Introduction
confidence: 99%