2021
DOI: 10.48550/arxiv.2109.00363
Preprint

ConRPG: Paraphrase Generation using Contexts as Regularizer

Abstract: A long-standing issue with paraphrase generation is how to obtain reliable supervision signals. In this paper, we propose an unsupervised paradigm for paraphrase generation based on the assumption that the probabilities of generating two sentences with the same meaning given the same context should be the same. Inspired by this fundamental idea, we propose a pipelined system which consists of paraphrase candidate generation based on contextual language models, candidate filtering using scoring functions, and p…
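The abstract's core assumption — that two sentences with the same meaning should be near-equally probable given the same context — directly suggests a filtering rule: score each candidate's context-conditioned log-probability and keep those close to the source sentence's score. The sketch below illustrates that comparison only; the lookup-table "language model", the example sentences, and the `max_gap` threshold are all illustrative assumptions (in the paper this role would be played by a large pretrained contextual LM and its scoring functions), not the authors' implementation.

```python
import math

# Toy stand-in for a contextual language model: maps (context, sentence)
# to a log-probability. In practice this would be a pretrained LM giving
# log p(sentence | context); these values are purely illustrative.
TOY_LM = {
    ("He ordered food.", "The meal was delicious."): math.log(0.020),
    ("He ordered food.", "The dish tasted great."): math.log(0.018),
    ("He ordered food.", "The weather was cold."): math.log(0.001),
}

def context_log_prob(context, sentence, lm=TOY_LM):
    """Log-probability of generating `sentence` given `context`."""
    return lm.get((context, sentence), float("-inf"))

def filter_candidates(context, source, candidates, max_gap=1.0):
    """Keep candidates whose context-conditioned log-probability lies
    within `max_gap` nats of the source sentence's, following the
    assumption that same-meaning sentences are near-equally likely
    given the same context. `max_gap` is a hypothetical threshold."""
    src_score = context_log_prob(context, source)
    return [c for c in candidates
            if abs(context_log_prob(context, c) - src_score) <= max_gap]

ctx = "He ordered food."
src = "The meal was delicious."
cands = ["The dish tasted great.", "The weather was cold."]
print(filter_candidates(ctx, src, cands))  # → ['The dish tasted great.']
```

Here the near-synonymous candidate survives (its log-probability gap to the source is about 0.1 nats) while the unrelated sentence is filtered out (gap of about 3 nats).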

Cited by 4 publications (6 citation statements)
References 54 publications
“…Paraphrases express the surface forms of the underlying semantic content [6] and capture the essence of language diversity [46]. Early work on the automatic generation of paraphrases was generally rule-based [7,8], but the recent trend brings neural network solutions to the fore [47,19,10,12,13,20,6]. Current research on paraphrasing focuses mainly on supervised methods, which require the availability of a large number of source and target pairs.…”
Section: Paraphrase Generation
confidence: 99%
“…Traditional solutions to paraphrase generation are generally rule-based [7,8], utilising lexical resources such as WordNet [9] to find word replacements. The recent trend brings neural network models to the fore [10,11,12,13], which are typically based on a sequence-to-sequence learning paradigm [14].…”
Section: Introduction
confidence: 99%
“…An ideal paraphrase not only needs to preserve the semantics of the input sentence but should also differ from it significantly in expression (i.e., expression difference) (Bhagat and Hovy, 2013). To address the problem of expression difference in generated sentences, researchers have made many attempts along different dimensions (Lin and Wan, 2021; Li et al., 2019; Hosking and Lapata, 2021; Meng et al., 2021). For example, Li et al. proposed multiple generators at different granularity levels to learn the mapping between input and output separately, and then combined them to complete the paraphrase generation task (Li et al., 2019).…”
Section: Related Work
confidence: 99%
“…Other unsupervised methods for paraphrase generation include VAEs (VQ-VAE) (Roy and Grangier, 2019), latent bag-of-words alignment (Fu et al., 2019), and simulated annealing (Liu et al., 2019a). Adapting large-scale pretraining (Devlin et al., 2018; Radford et al., 2018; Liu et al., 2019b; Clark et al., 2020; Sun et al., 2021b) to paraphrase generation has recently been investigated (Witteveen and Andrews, 2019; Hegde and Patil, 2020; Niu et al., 2020; Meng et al., 2021) and has shown promising potential to improve generation quality. Our work is distantly related to unsupervised text style transfer (Hu et al., 2017; Mueller et al., 2017; Shen et al., 2017; Li et al., 2018a; Fu et al., 2018), where the model alters a specific text attribute of an input sentence (such as sentiment) while preserving other attributes.…”
Section: Related Work
confidence: 99%