Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
DOI: 10.18653/v1/2021.findings-acl.170

An Evaluation of Disentangled Representation Learning for Texts

Abstract: Learning disentangled representations of texts, which encode information pertaining to different aspects of the text in separate representations, is an active area of research in NLP for controllable and interpretable text generation. These methods have, for the most part, been developed in the context of text style transfer, but are limited in their evaluation. In this work, we look at the motivation behind learning disentangled representations of content and style for texts and at the potential use-cases whe…
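
The abstract's core idea, encoding different aspects of a text in separate representations, can be made concrete with a small sketch. The model below is written for this summary, not taken from the paper; the class and head names are illustrative. It maps a shared sentence encoding through two projection heads, one per aspect; in practice, auxiliary losses (e.g. a style classifier on the style vector and an adversarial classifier on the content vector) would be added to push the two representations apart.

    # Minimal sketch of aspect-disentangled text encoding (illustrative; not the
    # paper's model). Two projection heads map a shared sentence encoding into
    # separate "content" and "style" vectors.
    import torch
    import torch.nn as nn

    class DisentangledEncoder(nn.Module):
        def __init__(self, vocab_size: int, hidden: int = 256, z_dim: int = 64):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)
            self.encoder = nn.GRU(hidden, hidden, batch_first=True)
            self.content_head = nn.Linear(hidden, z_dim)  # aspect 1: content
            self.style_head = nn.Linear(hidden, z_dim)    # aspect 2: style

        def forward(self, token_ids: torch.Tensor):
            _, h = self.encoder(self.embed(token_ids))  # h: (1, batch, hidden)
            h = h.squeeze(0)
            return self.content_head(h), self.style_head(h)

    enc = DisentangledEncoder(vocab_size=10_000)
    content_z, style_z = enc(torch.randint(0, 10_000, (4, 20)))  # 4 texts, 20 tokens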

Cited by 2 publications (4 citation statements). References 19 publications.
“…As PDNC consists of multiple novels, it enables cross-domain evaluation, such that the model is evaluated on a test set with no novels overlapping the training set. This is a more practical scenario, also adopted by the PDNC authors (Vishnubhotla, Hammond, and Hirst 2022; Vishnubhotla et al. 2023), as the trained model should be able to recognize speakers in any literary work, rather than only in the limited set of novels seen during training.…”
Section: Evaluation Protocol
confidence: 99%
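
The cross-domain protocol quoted above is straightforward to implement: split by novel so that no novel contributes examples to both train and test. A minimal sketch using scikit-learn's GroupShuffleSplit (variable names are illustrative; PDNC's actual data format differs):

    # Sketch of a cross-domain split: group examples by novel so that no novel
    # appears in both the train and test sets.
    from sklearn.model_selection import GroupShuffleSplit

    def cross_domain_split(examples, novel_ids, test_size=0.2, seed=0):
        splitter = GroupShuffleSplit(n_splits=1, test_size=test_size,
                                     random_state=seed)
        train_idx, test_idx = next(splitter.split(examples, groups=novel_ids))
        train_novels = {novel_ids[i] for i in train_idx}
        test_novels = {novel_ids[i] for i in test_idx}
        assert not train_novels & test_novels  # no novel overlap across splits
        return train_idx, test_idx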
“…BookNLP+ In a complementary effort, Vishnubhotla et al. (2023) enhance BookNLP by constraining the set of candidates used to resolve the mention spans produced by the coreference resolution step. This refined approach directly links quotations to entities, achieving state-of-the-art results in the cross-domain evaluation on PDNC.…”
Section: Experimental Approaches
confidence: 99%
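
As a rough illustration of the candidate-constraining idea, a quotation's speaker candidates can be restricted to entities whose coreference mentions fall near the quotation. This is a guess at the shape of the computation, not the actual BookNLP+ code; all names are hypothetical.

    # Rough sketch of constraining speaker candidates for a quotation using
    # coreference output (illustrative only). `mentions` is assumed to be a
    # list of (char_start, char_end, entity_id) triples from the coreference step.
    def candidate_entities(quote_start, quote_end, mentions, window=500):
        lo, hi = quote_start - window, quote_end + window
        return {entity for start, end, entity in mentions
                if lo <= start and end <= hi                          # near the quote
                and not (start >= quote_start and end <= quote_end)}  # not inside it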
“…One avenue to explore for debiasing patient representations is disentanglement. Disentangled representations encode information pertaining to different aspects of text in separate representations [52]. The most popular form of testing text disentanglement is to create separate representations for a text's semantic and syntactic information.…”
Section: Future Work
confidence: 99%
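
A standard way to test such a split is probing: train a simple classifier to predict one aspect (say, style) from the representation meant to exclude it; near-chance accuracy suggests disentanglement. A minimal sketch with scikit-learn (function and argument names are illustrative):

    # Probing sketch for disentanglement: if a linear classifier cannot recover
    # style labels from the content vectors, style information has plausibly
    # been removed from the content representation.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    def style_probe_accuracy(content_vectors: np.ndarray,
                             style_labels: np.ndarray) -> float:
        X_tr, X_te, y_tr, y_te = train_test_split(
            content_vectors, style_labels, test_size=0.3, random_state=0)
        probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        return probe.score(X_te, y_te)  # near-chance accuracy => disentangled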
“…Multiple researchers have noted the need for supervision at every step of the process when disentangling text [52, 14]. However, some work has managed to use weak supervision [32], and one study creates fully unsupervised disentangled text representations [31] that perform well on its downstream task.…”
Section: Future Work
confidence: 99%