Proceedings of the 5th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature, 2021
DOI: 10.18653/v1/2021.latechclfl-1.7

End-to-end style-conditioned poetry generation: What does it take to learn from examples alone?

Abstract: In this work, we design an end-to-end model for poetry generation based on conditioned recurrent neural network (RNN) language models whose goal is to learn stylistic features (poem length, sentiment, alliteration, and rhyming) from examples alone. We show this model successfully learns the 'meaning' of length and sentiment, as we can control it to generate longer or shorter as well as more positive or more negative poems. However, the model does not grasp sound phenomena like alliteration and rhyming, but ins…
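To make the setup concrete, below is a minimal sketch of a word-level RNN language model conditioned on stylistic features such as length, sentiment, alliteration, and rhyme. This is not the authors' code: the module names, dimensions, and the 4-dimensional style vector are illustrative assumptions, written here in PyTorch.

```python
import torch
import torch.nn as nn

class StyleConditionedLM(nn.Module):
    """Word-level LSTM language model conditioned on a style vector.

    Illustrative sketch only: a style vector (e.g. target length, sentiment,
    alliteration and rhyme indicators) is concatenated to every token
    embedding so the conditioning signal is visible at each decoding step.
    """

    def __init__(self, vocab_size, emb_dim=256, style_dim=4, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim + style_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, style, hidden=None):
        # tokens: (batch, seq_len) token ids; style: (batch, style_dim)
        emb = self.embed(tokens)                                     # (B, T, E)
        style_rep = style.unsqueeze(1).expand(-1, emb.size(1), -1)   # (B, T, S)
        output, hidden = self.rnn(torch.cat([emb, style_rep], dim=-1), hidden)
        return self.out(output), hidden                              # vocab logits

# Example: score a dummy batch with a "short, positive, rhyming" style vector.
model = StyleConditionedLM(vocab_size=10_000)
tokens = torch.randint(0, 10_000, (2, 12))
style = torch.tensor([[0.2, 0.9, 0.0, 1.0],   # [length, sentiment, alliteration, rhyme]
                      [0.2, 0.9, 0.0, 1.0]])
logits, _ = model(tokens, style)
print(logits.shape)  # torch.Size([2, 12, 10000])
```

Concatenating the style vector to every token embedding is one simple conditioning mechanism; the paper's exact architecture may differ.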

Cited by 4 publications (3 citation statements) · References 32 publications

Citation statements:
“…A popular trend is to investigate the extent to which models can be trained to generate language forms where training data is scarce. Wöckener et al. (2021) investigate this for the generation of poetry using ∼16k and ∼67k quatrains of English and German poetry respectively, and notice difficulties in GPT-2 learning sub-lexical phenomena including rhyme from this number of training examples alone. However, poetry presents a highly restrictive form of literary language where many forms contain formal constraints regarding length, syllable count, and metrical patterns.…”
Section: Creative Language Generation
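For readers unfamiliar with this kind of experiment, fine-tuning GPT-2 on a small quatrain corpus can be set up roughly as in the sketch below, using the Hugging Face transformers and datasets libraries. The file name quatrains.txt and all hyperparameters are assumptions for illustration, not the cited study's configuration.

```python
# Minimal GPT-2 fine-tuning sketch on a small poetry corpus
# (file path and hyperparameters are illustrative assumptions).
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import load_dataset

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# quatrains.txt: one quatrain per line (hypothetical corpus file).
dataset = load_dataset("text", data_files={"train": "quatrains.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-quatrains",
                           num_train_epochs=3,
                           per_device_train_batch_size=8),
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    train_dataset=dataset)
trainer.train()
```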
“…Evaluation is done over word pairs from the CMU pronunciation dictionary, using rules similar to ours to determine ground truth; on this task, their system reaches 0.91 F1-score. Wöckener et al. (2021) trained an end-to-end unidirectional word-level RNN on quatrains from the Chicago Rhyming Poetry Corpus. The RNN obeys user-specified constraints such as rhyme, alliteration, sentiment, text length, and time period.…”
Section: Related Work
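A rule-based ground truth of the kind described can be approximated with the CMU Pronouncing Dictionary: two words rhyme if their phoneme sequences match from the last stressed vowel onward. The sketch below uses NLTK's cmudict interface; the cited systems' exact rules may differ, so treat this as an illustrative assumption.

```python
# Rule-based rhyme check against the CMU Pronouncing Dictionary (via NLTK).
# Matching rule (identical phonemes from the last stressed vowel onward)
# is a common approximation; the cited papers' exact rules may differ.
import nltk
from nltk.corpus import cmudict

nltk.download("cmudict", quiet=True)
PRON = cmudict.dict()  # word -> list of possible phoneme sequences

def rhyme_part(phones):
    """Phonemes from the last stressed vowel (stress digit 1 or 2) onward."""
    for i in range(len(phones) - 1, -1, -1):
        if phones[i][-1] in "12":
            return phones[i:]
    return phones  # no stressed vowel found; fall back to the whole sequence

def rhymes(word_a, word_b):
    """True if any pronunciation pair of the two words shares a rhyme part."""
    prons_a = PRON.get(word_a.lower(), [])
    prons_b = PRON.get(word_b.lower(), [])
    return any(rhyme_part(pa) == rhyme_part(pb)
               for pa in prons_a for pb in prons_b)

print(rhymes("fire", "desire"))  # True
print(rhymes("fire", "water"))   # False
```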
“…While their factual accuracy is still open to debate, this is not an issue when using LMs with a creative purpose, in particular to generate works of art such as poems. In the recent past, LMs were put to use for poetry generation in several studies (Hopkins and Kiela, 2017; Lau et al., 2018; Van de Cruys, 2020; Wöckener et al., 2021; Uthus et al., 2022; Ormazabal et al., 2022), which found that fluency and intelligibility reached satisfactory levels. However, poems often exhibit structural, text-level properties that are still quite difficult to manage by LMs: rhyming patterns and division into verses and stanzas.…”
Section: Introduction