2021
DOI: 10.48550/arxiv.2110.08118
Preprint

Few-Shot Bot: Prompt-Based Learning for Dialogue Systems

Andrea Madotto,
Zhaojiang Lin,
Genta Indra Winata
et al.

Abstract: Learning to converse using only a few examples is a great challenge in conversational AI. The current best conversational models, which are either good chit-chatters (e.g., Blender-Bot) or goal-oriented systems (e.g., MinTL), are language models (LMs) fine-tuned on large conversational datasets. Training these models is expensive, both in terms of computational resources and time, and it is hard to keep them up to date with new conversational skills. A simple yet unexplored solution is prompt-based few-shot learning…
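
As a rough illustration of the prompt-based few-shot approach the abstract describes, the sketch below builds a dialogue prompt from a handful of in-context example exchanges and lets a causal LM complete the next bot turn, with no gradient-based fine-tuning. It is a minimal sketch only: the gpt2 checkpoint, the example dialogues, and the decoding settings are stand-in assumptions, not the paper's actual setup, which relies on much larger pretrained LMs.

```python
# Minimal sketch of prompt-based few-shot dialogue (not the paper's exact setup).
# Assumption: a Hugging Face causal LM; "gpt2" is a small stand-in checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The few in-context example exchanges are the only source of "learning";
# no model parameters are updated.
few_shot_prompt = (
    "The following is a friendly conversation between a user and a bot.\n\n"
    "User: Hi, how are you today?\n"
    "Bot: I'm doing great, thanks for asking! How about you?\n\n"
    "User: Can you recommend a good book?\n"
    "Bot: I really enjoyed 'The Martian'. It's a fun science-fiction read.\n\n"
    "User: What's your favourite food?\n"
    "Bot:"
)

output = generator(
    few_shot_prompt,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.9,
    return_full_text=False,  # return only the newly generated continuation
)

# Keep only the bot's reply, cutting at the next simulated user turn.
reply = output[0]["generated_text"].split("User:")[0].strip()
print(reply)
```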

Cited by 4 publications (4 citation statements). References 53 publications.
“…This approach was used for multiple NLP tasks shown in the work [15]. Some more specific use cases include relation classification [24] or dialogue systems [25].…”
Section: The Persona Pattern Example
confidence: 99%
“…An LLM such as OpenAI's GPT (Generative Pre-trained Transformer) is basically trained to generate text, or rather to answer questions with paragraphs of text (Guan et al, 2020). Once trained, it can generate complete sentences and paragraphs that are coherent and, in many cases, indistinguishable from those written by humans, simply from an initial stimulus or prompt (Madotto et al, 2021).…”
Section: What Is Generative AI?
confidence: 99%
“…Inspired by the prompt learning approaches with PLM for the English language [15,18,22,26,34,36,46], we propose to preserve the multilinguality of mPLM by bridging the gap between the unsupervised pre-training tasks and downstream tasks (e.g. dialogue generation) in the fine-tuning stage (i.e.…”
Section: Prompt Learning To Mitigate Catastrophic Forgetting In Fs-xl...
confidence: 99%