Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1043
This Email Could Save Your Life: Introducing the Task of Email Subject Line Generation

Abstract: Given the overwhelming number of emails, an effective subject line becomes essential to better inform the recipient of the email's content. In this paper, we propose and study the task of email subject line generation: automatically generating an email subject line from the email body. We create the first dataset for this task and find that email subject line generation favors extremely abstractive summaries, which differentiates it from news headline generation or news single-document summarization. We then devel…

Cited by 38 publications (32 citation statements); references 42 publications.
“…Email Zhang and Tetreault (2019) introduced an abstractive business and personal email summarization dataset which consists of email and subject pairs. We collect the unlabeled email corpus from the Enron Email Dataset.…”
Section: AdaptSum
Confidence: 99%
“…Yet, despite their practicality, very few studies have used domain adaptation methods in the low-resource scenario for the abstractive summarization task. To address this research gap, we present AdaptSum, the first benchmark to simulate the low-resource domain Adaptation setting for abstractive Summarization systems with a combination of existing datasets across six diverse domains (dialog (Gliwa et al, 2019), email (Zhang and Tetreault, 2019), movie review (Wang and Ling, 2016), debate (Wang and Ling, 2016), social media (Kim et al, 2019), and science (Yasunaga et al, 2019)), and for each domain, we reduce the number of training samples to a small quantity so as to create a low-resource scenario.…”
Section: Introduction
Confidence: 99%
“…In our Wikipedia section heading generation task, the prevalence of generic headings makes the task more abstractive than datasets like Gigaword (Rush et al, 2015), or even other short-text generation tasks, like email subject prediction (Zhang and Tetreault, 2019), which makes it a useful dataset for analyzing model performance. It is also extrinsically useful: most automated methods for improving Wikipedia focus on creating new content, such as through multi-document summarization or generating text from structured data (Lebret et al, 2016).…”
Section: Related Work
Confidence: 99%