Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.121
Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

Abstract: Cross-lingual summarization aims at summarizing a document in one language (e.g., Chinese) into another language (e.g., English). In this paper, we propose a novel method inspired by the translation pattern in the process of obtaining a cross-lingual summary. We first attend to some words in the source text, then translate them into the target language, and summarize to get the final summary. Specifically, we first employ the encoder-decoder attention distribution to attend to the source words. Second, we pres…
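The attend-then-translate pattern the abstract describes can be illustrated with a minimal sketch: route the encoder-decoder attention mass for one decoding step through a bilingual lexicon to obtain target-language word probabilities. The lexicon entries, probabilities, and function names below are illustrative assumptions for exposition, not the paper's actual model.

```python
# Hypothetical bilingual lexicon: source (Chinese) word -> candidate
# English translations with lexicon probabilities. All entries are
# illustrative, not taken from the paper.
lexicon = {
    "经济": {"economy": 0.9, "economic": 0.1},
    "增长": {"growth": 0.8, "increase": 0.2},
    "放缓": {"slows": 0.6, "slowdown": 0.4},
}

def translate_attention(source_tokens, attention):
    """Project an encoder-decoder attention distribution over source
    tokens into a distribution over target-language words, by routing
    each source word's attention mass through its lexicon
    translation probabilities."""
    target_probs = {}
    for token, weight in zip(source_tokens, attention):
        for translation, p in lexicon.get(token, {}).items():
            target_probs[translation] = target_probs.get(translation, 0.0) + weight * p
    return target_probs

source = ["经济", "增长", "放缓"]
attn = [0.2, 0.5, 0.3]  # attention weights for one decoding step

probs = translate_attention(source, attn)
best = max(probs, key=probs.get)  # most probable target word to emit
```

Because both the attention weights and each word's lexicon probabilities sum to one, the resulting target-side scores also form a valid probability distribution.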

Cited by 31 publications (35 citation statements)
References 25 publications (29 reference statements)
“…The method proposed by Cao et al. (2020) requires not only parallel summaries but also document pairs translated by MT systems. Another method proposed by Zhu et al. (2020) requires bilingual lexicons extracted from large parallel MT datasets (2.08M sentence pairs from eight LDC corpora). We choose not to use these models as baselines since comparing MCLAS with them is unfair.…”
Section: Baselines
confidence: 99%
“…Cross-lingual summarization (CLS) helps people efficiently grasp salient information from articles in a foreign language. Neural approaches to CLS require large-scale datasets containing millions of cross-lingual document-summary pairs (Zhu et al., 2019; Cao et al., 2020; Zhu et al., 2020). However, two challenges arise with these approaches: 1) most languages are low-resource, thereby lacking document-summary paired data; 2) large parallel datasets across different languages for neural-based CLS are rare and expensive, especially under the current trend of neural networks.…”
Section: Introduction
confidence: 99%
“…CopyNet is also widely used in text summarization (See et al., 2017; Zhu et al., 2020), automatic post-editing (Huang et al., 2019), grammar correction (Zhao et al., 2019a) and so on.…”
Section: CopyNet
confidence: 99%
“…From a high-level perspective, our methods share a similar Transformer-based architecture with Huang et al. (2019) and Zhu et al. (2020). Huang et al. (2019) employed CopyNet to copy from a draft generated by a pre-trained NMT system.…”
Section: CopyNet
confidence: 99%
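The copy mechanism these citation statements refer to can be sketched following the pointer-generator formulation of See et al. (2017): the final output distribution for a decoding step mixes a vocabulary (generation) distribution with an attention-induced copy distribution over source tokens. The function name and all numbers below are illustrative, not any cited system's actual values.

```python
def pointer_generator_step(p_gen, vocab_dist, source_tokens, attention):
    """Mix generation and copy distributions for one decoding step.

    p_gen        -- probability of generating from the vocabulary
    vocab_dist   -- dict word -> generation probability (sums to 1)
    attention    -- attention weights over source_tokens (sums to 1)
    """
    # Scale the vocabulary distribution by the generation probability...
    final = {w: p_gen * p for w, p in vocab_dist.items()}
    # ...and add attention mass, scaled by (1 - p_gen), to each
    # source token, allowing it to be copied directly.
    for token, weight in zip(source_tokens, attention):
        final[token] = final.get(token, 0.0) + (1.0 - p_gen) * weight
    return final

dist = pointer_generator_step(
    p_gen=0.7,
    vocab_dist={"the": 0.6, "economy": 0.4},
    source_tokens=["economy", "slows"],
    attention=[0.9, 0.1],
)
```

Note that a word appearing both in the vocabulary and in the source ("economy" here) accumulates probability from both paths, which is what lets such models prefer copying salient source terms.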