2022
DOI: 10.48550/arxiv.2204.13509
Preprint

On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model

Cited by 1 publication (2 citation statements)
References: 0 publications
“…It is usually difficult for an LLM to perform well on domain-specific tasks such as our hint-text generation, and a common practice is to employ the in-context learning [42,94,119] scheme to boost performance. It provides the LLM with examples that demonstrate the instruction, enabling the LLM to better understand the task.…”
Section: Enriching Prompt With Examples
confidence: 99%
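
The in-context learning setup described in this citation can be illustrated with a minimal sketch: demonstration pairs are prepended to the instruction so the model can infer the task format from examples rather than from the instruction alone. The prompt wording, field names, and example pairs below are illustrative assumptions, not the cited paper's actual prompt.

# Minimal sketch of in-context (few-shot) prompting for hint-text generation.
# The demonstration pairs and prompt wording are hypothetical.

def build_prompt(examples, text_input):
    """Prepend demonstration pairs to the instruction so the LLM can
    infer the task from examples."""
    lines = ["Generate a hint-text for the given text input field."]
    for ex in examples:
        lines.append(f"Text input: {ex['input']}")
        lines.append(f"Hint-text: {ex['hint']}")
    lines.append(f"Text input: {text_input}")
    lines.append("Hint-text:")
    return "\n".join(lines)

demos = [
    {"input": "search bar on a shopping app", "hint": "Search for products"},
    {"input": "login form email field", "hint": "Enter your email address"},
]
print(build_prompt(demos, "sign-up form password field"))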
“…To achieve this, we first build a basic example dataset of hint-texts from the popular mobile apps in our motivational study. Research shows that the quality and relevance of examples can significantly affect LLM performance [42,94,119]. Therefore, based on the dataset we built, a retrieval-based example selection method (Section 5.2.2) is designed to select the most appropriate example according to the text input and its hint-text, which further enables the LLM to learn from a pertinent example.…”
Section: Enriching Prompt With Examples
confidence: 99%
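
The retrieval-based example selection described above can be sketched as picking, from the example dataset, the demonstration whose text input is most similar to the query. A simple word-overlap (Jaccard) similarity stands in here for whatever retriever the cited work actually uses; the dataset entries are hypothetical.

# Minimal sketch of retrieval-based example selection for in-context learning.
# Jaccard word overlap is an assumed stand-in for the actual retriever.

def jaccard(a, b):
    """Word-level Jaccard similarity between two strings."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def select_example(example_dataset, text_input):
    """Return the (input, hint) pair whose input is most similar to the
    query, so the demonstration shown to the LLM is relevant to it."""
    return max(example_dataset, key=lambda ex: jaccard(ex["input"], text_input))

dataset = [
    {"input": "search bar on a shopping app", "hint": "Search for products"},
    {"input": "login form email field", "hint": "Enter your email address"},
    {"input": "sign-up form password field", "hint": "Create a password"},
]
best = select_example(dataset, "password field on a registration form")
print(best["input"], "->", best["hint"])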