2021
DOI: 10.48550/arxiv.2109.09707
Preprint
A Plug-and-Play Method for Controlled Text Generation

Abstract: Large pre-trained language models have repeatedly shown their ability to produce fluent text. Yet even when starting from a prompt, generation can continue in many plausible directions. Current decoding methods with the goal of controlling generation, e.g., to ensure specific words are included, either require additional models or fine-tuning, or work poorly when the task at hand is semantically unconstrained, e.g., story generation. In this work, we present a plug-and-play decoding method for controlled language…

Cited by 2 publications (2 citation statements)
References 7 publications (13 reference statements)
“…Later research then proposed using different methods for text generation control that do not demand re-training the large-scale NLP models, one of which is Keyword2Text (K2T) [18]. K2T uses keywords as hard and soft constraints for text generation.…”
Section: Controlling Text Generation
confidence: 99%
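The citation statement above describes K2T's use of keywords as hard and soft constraints on decoding. A minimal sketch of that idea, with assumed inputs (a toy five-word vocabulary and a hypothetical precomputed keyword-similarity vector; the function name and `strength` parameter are illustrative, not from the paper):

```python
import numpy as np

def k2t_decode_step(logits, keyword_ids, similarity, strength=5.0):
    """One decoding step with a K2T-style keyword constraint (sketch).

    logits: unnormalised next-token scores over the vocabulary.
    keyword_ids: vocabulary ids of the target keywords (hard constraint).
    similarity: per-word semantic similarity to the keywords in [0, 1]
        (soft constraint); assumed precomputed here.
    strength: how strongly decoding is pushed toward the keywords.
    """
    shifted = logits + strength * similarity   # soft: favour related words
    shifted = shifted.copy()
    shifted[keyword_ids] += strength           # hard: favour the keywords themselves
    probs = np.exp(shifted - shifted.max())    # stable softmax
    return probs / probs.sum()

# Toy vocabulary of 5 words; word 3 is the keyword.
logits = np.array([1.0, 0.5, 0.2, 0.1, 0.0])
sim = np.array([0.0, 0.1, 0.0, 1.0, 0.2])
p = k2t_decode_step(logits, [3], sim)
```

Sampling from the shifted distribution biases generation toward the keywords without retraining the underlying model, which is the plug-and-play property the citing work highlights.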
“…Later research then proposed using different methods for text generation control that do not demand re-training the large-scale NLP models, one of which is Keyword2Text (K2T) [18]. K2T uses keywords as hard and soft constraints for text generation.…”
Section: Controlling Text Generationmentioning
confidence: 99%
“…Transformer [1] is a deep learning language model that uses the mechanism of attention, which gives a different weight of significance to each part of the input data. By solving the recursion and lack of global dependency problem of recurrent neural network (RNN) [2] and long short-term memory (LSTM) [3], the transformer is becoming the de facto standard for natural language processing (NLP) applications such as text generation [4], [5], text classification [6], [7], and machine translation [8], [9]. Among them, text generation, broadly referred to as natural language generation (NLG), is related to the automatic generation of human-readable text by a computer.…”
Section: Introduction
confidence: 99%
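The introduction excerpt above summarises the attention mechanism that gives each part of the input a different weight of significance. A minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer (toy 2-dimensional queries/keys/values chosen for illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                           # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights                               # weighted mix of values

# Two toy token representations attending over themselves (self-attention).
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
out, w = scaled_dot_product_attention(Q, Q, Q)
```

Because every position attends to every other in one step, this avoids the recursion and limited global-dependency range of RNNs and LSTMs that the excerpt mentions.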