Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.94

Acrostic Poem Generation

Abstract: We propose a new task in the area of computational creativity: acrostic poem generation in English. Acrostic poems are poems that contain a hidden message; typically, the first letter of each line spells out a word or short phrase. We define the task as a generation task with multiple constraints: given an input word, 1) the initial letters of the lines should spell out the provided word, 2) the poem's semantics should also relate to it, and 3) the poem should conform to a rhyming scheme. We further provide a …
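To make the first constraint concrete, below is a minimal sketch, not the authors' system: it seeds each line with the required letter and lets an off-the-shelf GPT-2 (via the Hugging Face transformers library) continue it. The semantic and rhyming constraints from the paper are not enforced here; the model choice, the sampling settings, and the acrostic_poem helper are illustrative assumptions.

```python
# Toy acrostic generation: each line is forced to start with the next
# letter of the input word by seeding the prompt with that letter.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def acrostic_poem(word: str, max_line_tokens: int = 12) -> str:
    poem_lines = []
    context = ""
    for letter in word.upper():
        # Force the line to start with the required letter.
        prompt = context + letter
        input_ids = tokenizer.encode(prompt, return_tensors="pt")
        output = model.generate(
            input_ids,
            max_new_tokens=max_line_tokens,
            do_sample=True,
            top_p=0.9,
            pad_token_id=tokenizer.eos_token_id,
        )
        text = tokenizer.decode(output[0], skip_special_tokens=True)
        # Keep only the newly generated line, dropping the earlier context.
        line = text[len(context):].split("\n")[0].strip()
        poem_lines.append(line)
        context = "\n".join(poem_lines) + "\n"
    return "\n".join(poem_lines)

print(acrostic_poem("POEM"))
```

Seeding each line with its letter satisfies the acrostic constraint by construction; the paper's full model additionally ties the poem's semantics to the input word and enforces a rhyming scheme, neither of which this toy version attempts.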

Cited by 5 publications (5 citation statements) · References 15 publications

Citation statements (ordered by relevance):
“…Qixin et al. [24] propose a new two-stage poetry generation method based on planning neural networks. In [25], a baseline model for acrostic poetry generation is proposed, which combines a conditional neural language model with a neural rhyming model to generate acrostic poems under several constraints. Nevertheless, none of these studies concerns non-textual modalities; they all focus on generating poetry from text input.…”
Section: Poetry Generation (mentioning, confidence: 99%)
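As a rough illustration of the two-stage "plan, then generate" idea attributed to [24] above, here is a minimal sketch, not the original implementation: stage one plans one keyword per line, and stage two generates each line conditioned on its keyword with a generic language model. The plan_keywords stub and the prompt wording are hypothetical placeholders.

```python
# Two-stage sketch: plan keywords first, then realize one line per keyword.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def plan_keywords(topic: str, n_lines: int = 4) -> list[str]:
    # Stand-in for the planning network of [24], which expands a topic
    # into one sub-topic keyword per line; here the plan is a stub.
    return [topic] * n_lines

def generate_poem(topic: str) -> str:
    lines = []
    for keyword in plan_keywords(topic):
        prompt = f"A short poem line about {keyword}:"
        out = generator(prompt, max_new_tokens=12, do_sample=True)[0]["generated_text"]
        lines.append(out[len(prompt):].strip().split("\n")[0])  # keep first line only
    return "\n".join(lines)

print(generate_poem("autumn"))
```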
“…[24] propose a new two-stage poetry generation method based on planning neural networks. In [25], a baseline model for acrostic poetry generation is proposed, which combines a conditional neural language model with a neural rhyming model to generate acrostic poems under several constraints.…”
Section: Related Work (mentioning, confidence: 99%)
“…This framework has also been extended to related tasks such as generating poetry from images (Liu et al., 2018), translating poetry (Ghazvininejad et al., 2018), and generating song lyrics given an input melody (Watanabe et al., 2018). A drawback shared by most poetry generation systems, including even very recent ones (Van de Cruys, 2020; Agarwal and Kann, 2020), is that they require hard- and hand-coded rules to generate poetry with specific properties, such as rhyming and alliteration, or to filter out output lacking these properties. For example, to ensure rhyming, Lau et al. (2018) exploit the fact that their data (sonnets) has a particular structure from which they can infer that certain word pairs must rhyme, which they incorporate into the modelling.…”
Section: Poetry Generation (mentioning, confidence: 99%)
“…Hämäläinen and Alnajjar (2019) define rules for style features. Agarwal and Kann (2020) tell the model when a rhyming word is required and then modify the prediction process accordingly. This means that the models lack the ability to discover elementary properties of poetry themselves and instead rely on the modeller's intention and ingenuity to do so.…”
Section: Poetry Generation (mentioning, confidence: 99%)
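The statement above describes modifying the prediction process when a rhyming word is required. A minimal sketch of that general idea, not Agarwal and Kann's actual code, assuming GPT-2 and the pronouncing package (CMU Pronouncing Dictionary) as the source of rhyme candidates: mask the next-token logits so that only rhyming words can be selected.

```python
# Constraint-aware decoding sketch: restrict the next token to words
# that rhyme with a given target word.
import torch
import pronouncing
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def rhyming_next_word(context: str, rhyme_word: str) -> str:
    """Pick the most probable next word among those rhyming with rhyme_word."""
    allowed_ids = []
    for candidate in pronouncing.rhymes(rhyme_word):
        ids = tokenizer.encode(" " + candidate)  # leading space: word-initial token
        if len(ids) == 1:                        # keep single-token candidates only
            allowed_ids.append(ids[0])
    input_ids = tokenizer.encode(context, return_tensors="pt")
    with torch.no_grad():
        logits = model(input_ids).logits[0, -1]  # next-token distribution
    masked = torch.full_like(logits, float("-inf"))
    masked[allowed_ids] = logits[allowed_ids]    # rule out non-rhyming tokens
    return tokenizer.decode([int(masked.argmax())]).strip()

print(rhyming_next_word(
    "Roses are red, violets are blue,\nI wrote this line just for", "blue"))
```

Masking to single-token rhyme candidates keeps the sketch short; a fuller version would also score multi-token candidates, as any practical system must.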
“…In recent years, the emergence of neural architectures and language models like GPT-2 [16] with millions of parameters has resulted in rapid advancements on various NLP tasks. These models have proved efficient at generating artificial poems [56,57], stories, and news articles with just a few epochs of fine-tuning. We employ a similar strategy to generate dialogues based on the subtitles collected from more than 2000 films, using GPT-2 [16].…”
Section: Future Directions (mentioning, confidence: 99%)
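As a sketch of the fine-tuning strategy referred to above, assuming a plain-text corpus in corpus.txt and standard Hugging Face tooling; the cited paper's own data and hyper-parameters are not shown, and every path and setting here is an illustrative assumption.

```python
# Minimal GPT-2 fine-tuning on a domain text corpus.
from transformers import (GPT2LMHeadModel, GPT2Tokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# Hypothetical corpus file; one training example per line.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-finetuned",
        num_train_epochs=3,               # "a few epochs", per the statement above
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```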