Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
DOI: 10.18653/v1/2020.emnlp-main.524

Generating similes effortlessly like a Pro: A Style Transfer Approach for Simile Generation

Abstract: Literary tropes, from poetry to stories, are at the crux of human imagination and communication. Figurative language, such as a simile, goes beyond plain expressions to give readers new insights and inspirations. We tackle the problem of simile generation. Generating a simile requires proper understanding for effective mapping of properties between two concepts. To this end, we first propose a method to automatically construct a parallel corpus by transforming a large number of similes collected from Reddit to…
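
As a rough illustration of the parallel-corpus idea sketched in the abstract, the snippet below pairs a self-labeled simile with a literal version by simply stripping the "like a <vehicle>" comparator. This is a deliberately naive Python sketch: the regex, helper name, and example sentences are assumptions, and the paper's actual construction (as the citing statements below note) relies on commonsense knowledge rather than string stripping.

```python
import re

# Naive pairing: strip the "like a/an <vehicle>" comparator to obtain a
# literal counterpart. Illustrative assumption, not the paper's pipeline.
SIMILE_PATTERN = re.compile(r"\s+like an?\s+[\w' -]+", flags=re.IGNORECASE)

def make_pair(simile: str):
    """Return a (literal, simile) training pair, or None if no comparator is found."""
    literal = SIMILE_PATTERN.sub("", simile).strip()
    if literal == simile.strip():
        return None  # no "like a/an ..." phrase to remove
    return literal, simile.strip()

examples = [
    "The room was silent like a graveyard.",
    "He ran fast.",  # not a simile, yields no pair
]
pairs = [p for p in map(make_pair, examples) if p is not None]
print(pairs)  # [('The room was silent.', 'The room was silent like a graveyard.')]
```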

Citations: cited by 42 publications (60 citation statements)
References: 32 publications
“…ST datasets that consist of parallel pairs in different styles include: GYAFC for formality (Rao and Tetreault, 2018), Yelp (Shen et al., 2017) and Amazon Product Reviews for sentiment (He and McAuley, 2016), political slant and gender controlled datasets (Prabhumoye et al., 2018), Expert Style Transfer (Cao et al., 2020), PASTEL for imitating personal style (Kang et al., 2019), SIMILE for simile generation (Chakrabarty et al., 2020), and others.…”
Section: Related Work
confidence: 99%
“…With regard to paraphrasing, Stowe et al. (2020) use a metaphor masking process to generate parallel training data in which key metaphoric words are hidden, causing the resulting seq2seq model to generate metaphoric words. Chakrabarty et al. (2020) build a simile generation system based on pretrained seq2seq models. Similarly, the MERMAID system uses a semi-supervised data collection method to generate metaphoric pairs, using them to fine-tune a BART-based seq2seq model (Chakrabarty et al., 2021).…”
Section: Metaphor Generation
confidence: 99%
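
Both systems quoted above follow the same broad recipe: fine-tune a pretrained seq2seq model on parallel literal-to-figurative pairs, then decode. Below is a minimal sketch assuming the Hugging Face transformers library; the checkpoint name, the single toy pair, and the decoding settings are illustrative assumptions, not the cited papers' configurations.

```python
# Minimal sketch: fine-tune a pretrained BART seq2seq model on a
# (literal, figurative) pair, then generate with beam search.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# One illustrative training step on a single literal -> simile pair.
inputs = tokenizer(["The room was silent."], return_tensors="pt")
labels = tokenizer(["The room was silent like a graveyard."], return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss
loss.backward()  # in practice: an optimizer, many pairs, several epochs

# After fine-tuning, figurative paraphrases come from standard decoding.
output_ids = model.generate(**inputs, num_beams=4, max_length=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```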
“…To the best of our knowledge, we are the first to work on hyperbole generation. The closest work is that of Chakrabarty et al. (2020b), who propose an end-to-end approach for simile generation that also utilizes commonsense knowledge predicted by COMET (Bosselut et al., 2019). However, they only utilize the PROPERTY relation to replace certain parts of literal sentences.…”
Section: Figurative Generation
confidence: 99%
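
The statement above describes using the PROPERTY relation of a commonsense model to rewrite part of a literal sentence. A toy Python sketch of that idea follows; comet_property_of() is a hypothetical stand-in for a real COMET query, and the mapping it returns is invented for illustration.

```python
# Toy sketch: pick a simile vehicle that shares the literal property word,
# then substitute "like a <vehicle>" for that word.
def comet_property_of(prop: str) -> str:
    # Hypothetical stand-in for a COMET-style commonsense lookup.
    toy_kb = {"silent": "graveyard", "brave": "lion", "slow": "snail"}
    return toy_kb.get(prop, prop)

def literal_to_simile(sentence: str, prop: str) -> str:
    """Replace the literal property word with 'like a <vehicle>' that shares it."""
    vehicle = comet_property_of(prop)
    return sentence.replace(prop, f"like a {vehicle}")

print(literal_to_simile("The hall was silent.", "silent"))
# -> The hall was like a graveyard.
```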
“…pared to the many efforts on other figurative languages such as puns, sarcasms, metaphors and similes (He et al., 2019; Chakrabarty et al., 2020a; Su et al., 2020; Yu and Wan, 2019; Chakrabarty et al., 2020b), the exploration of hyperboles is still in the infancy stage: NLP researchers have just started to look at automatic hyperbole detection (Troiano et al., 2018; Kong et al., 2020). According to Claridge (2010), hyperboles are divided into two categories: those at the word or phrase level and those at the clause or sentence level.…”
Section: Introduction
confidence: 99%