Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.577
Don’t Go Far Off: An Empirical Study on Neural Poetry Translation

Abstract: Despite constant improvements in machine translation quality, automatic poetry translation remains a challenging problem due to the lack of open-sourced parallel poetic corpora, and to the intrinsic complexities involved in preserving the semantics, style and figurative nature of poetry. We present an empirical investigation for poetry translation along several dimensions: 1) size and style of training data (poetic vs. non-poetic), including a zero-shot setup; 2) bilingual vs. multilingual learning; and 3) lang…

Cited by 6 publications (5 citation statements). References 44 publications.
“…They have been shown to encode a wealth of human knowledge [27,36,58] and can perform sophisticated reasoning [34,52,77] and action planning [27]. Their linguistic and storytelling capabilities have been utilized in creative writing [15,20,51,83] and a myriad of other creative applications [11,42-44,76]. Moreover, LLMs can adapt to new tasks based on a given description without re-training, a method known as prompting.…”
Section: Large Language Models and Agents
confidence: 99%
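
The prompting idea mentioned in the quote above can be made concrete with a minimal sketch, assuming a generic instruction-tuned model served through the Hugging Face transformers text-generation pipeline; the checkpoint name, prompt wording, and French verse are illustrative assumptions, not details from the cited papers.

    # Zero-shot prompting sketch: the task is described in natural language and
    # the model is used as-is, with no re-training or fine-tuning.
    from transformers import pipeline

    # Assumption: any instruction-tuned LLM checkpoint works here.
    generator = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2")

    prompt = (
        "Translate the following French verse into English, preserving its "
        "meaning and poetic style:\n\n"
        "Demain, dès l'aube, à l'heure où blanchit la campagne\n\n"
        "English translation:"
    )

    result = generator(prompt, max_new_tokens=60, do_sample=False)
    print(result[0]["generated_text"])
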
“…As AI continues to advance in its capability to generate content and automate tasks, it is being increasingly incorporated into the creative processes across various domains [15,20,26,47,48,51,51,65,71,83]. This includes areas such as story writing [15,20,51,83], music composition [26,47,48], comic creation [65], and game design [86]. For instance, TaleBrush [20] enables users to craft stories with the support of language models by sketching storylines metaphorically.…”
Section: Human-AI Co-creation
confidence: 99%
“…Large language models (LLMs) have made significant strides in NLP tasks such as reading comprehension (Chakrabarty et al. 2022). These LLMs have expanded their capabilities by incorporating richer corpus information and utilizing sophisticated pre-training techniques (Zhang et al. 2021; Wang et al. 2023b).…”
Section: Contextual Dynamic Representation
confidence: 99%
“…Advancing in this domain, Ghazvininejad et al. (2018) introduced an approach for automatic poetry translation that preserves target rhythm and rhyme patterns, utilizing neural translation techniques to improve translation quality while adhering to specified constraints. Chakrabarty et al. (2021) recognized the complexities of automatic poetry translation due to semantic, stylistic, and figurative language preservation challenges, noting the effectiveness of multilingual fine-tuning on poetic text. Ma and Wang (2020) introduced a linguistic framework for analyzing and contrasting poetry translations, albeit not aimed at contrasting human and machine translation.…”
Section: Machine Translation Of Poetry
confidence: 99%
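
The “multilingual fine-tuning on poetic text” setting noted above can likewise be illustrated with a minimal sketch, not the authors' released code: it computes the standard cross-entropy loss of a multilingual pre-trained translation model (mBART-50 here, an assumed choice) on a toy parallel poetic pair; a real run would iterate over a poetic corpus with an optimizer.

    # Fine-tuning sketch (assumptions: mBART-50 checkpoint, a French->English pair).
    from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

    model_name = "facebook/mbart-large-50-many-to-many-mmt"
    tokenizer = MBart50TokenizerFast.from_pretrained(
        model_name, src_lang="fr_XX", tgt_lang="en_XX"
    )
    model = MBartForConditionalGeneration.from_pretrained(model_name)

    # Hypothetical toy parallel pair (source verse, reference translation).
    src = "Demain, dès l'aube, à l'heure où blanchit la campagne"
    ref = "Tomorrow, at dawn, the hour when the countryside whitens"

    batch = tokenizer(src, text_target=ref, return_tensors="pt")
    loss = model(**batch).loss   # cross-entropy against the reference verse
    loss.backward()              # an optimizer step would follow in a training loop

After fine-tuning, generation for a new verse would call model.generate with forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"] so that decoding starts in the target language.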