Findings of the Association for Computational Linguistics: EMNLP 2020 (2020)
DOI: 10.18653/v1/2020.findings-emnlp.217
ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training

Abstract: This paper presents a new sequence-to-sequence pre-training model called ProphetNet, which introduces a novel self-supervised objective named future n-gram prediction and the proposed n-stream self-attention mechanism. Instead of optimizing one-step-ahead prediction in the traditional sequence-to-sequence model, ProphetNet is optimized by n-step-ahead prediction that predicts the next n tokens simultaneously based on previous context tokens at each time step. The future n-gram prediction explicitly encourages…
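As a rough illustration of the n-step-ahead objective summarized above, the sketch below computes a future n-gram loss, assuming the decoder has already produced one logits tensor per prediction stream. The function name future_ngram_loss, its interface, and the uniform stream weighting are illustrative assumptions, not the paper's implementation.

import torch
import torch.nn.functional as F

def future_ngram_loss(stream_logits, targets, pad_id=0):
    """stream_logits: list of n tensors, each (batch, seq_len, vocab);
    stream i at position t scores the token at position t + i.
    targets: (batch, seq_len) gold token ids (decoder inputs are these,
    shifted right, under teacher forcing)."""
    n = len(stream_logits)
    losses = []
    for i, logits in enumerate(stream_logits):
        shifted = targets[:, i:]                    # targets i steps further ahead
        logits_i = logits[:, :shifted.size(1), :]   # drop positions with no target left
        losses.append(
            F.cross_entropy(
                logits_i.reshape(-1, logits_i.size(-1)),
                shifted.reshape(-1),
                ignore_index=pad_id,
            )
        )
    # The paper weights the streams; a plain average is used here for brevity.
    return sum(losses) / n

In this sketch the first stream reduces to the usual one-step-ahead language-modeling loss, and each additional stream supervises a token further into the future, which is the behavior the abstract describes.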

Cited by 229 publications (194 citation statements). References 22 publications (33 reference statements).
“…To address the limitations of the Question Answering System, we propose a Question Similarity mechanism. The possible generated questions are from the state-of-the-art question generation system called ProphetNet [29] and the Question posed is from the SQuAD 2.0 dataset. The Question Similarity mechanism calculates the cosine similarity between the possible generated questions from the given paragraph and the question posed.…”
Section: Methods (mentioning)
confidence: 99%
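As a rough sketch of the question-similarity step quoted above, the snippet below scores candidate generated questions against the posed question by cosine similarity. The TF-IDF representation and the helper name most_similar_generated_question are assumptions for illustration, not details taken from the cited work.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def most_similar_generated_question(posed_question, generated_questions):
    """Return the generated question closest to the posed one, with its score."""
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform([posed_question] + generated_questions)
    # Row 0 is the posed question; rows 1 onward are the generated candidates.
    scores = cosine_similarity(vectors[0], vectors[1:])[0]
    best = int(scores.argmax())
    return generated_questions[best], float(scores[best])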
See 4 more Smart Citations
“…To address the limitations of the Question Answering System, we propose a Question Similarity mechanism. The possible generated questions are from the state-of-the-art question generation system called ProphetNet [29] and the Question posed is from the SQuAD 2.0 dataset. The Question Similarity mechanism calculates the cosine similarity between the possible generated questions from the given paragraph and the question posed.…”
Section: Methodsmentioning
confidence: 99%
“…In recent years, several works have been proposed to tackle world knowledge by combining search factors based on bigram hashing, TF-IDF matching [7] and machine reading comprehension [22, 29]. This gave the Question Answering System a good starting point.…”
Section: Related Work (mentioning)
confidence: 99%