2022
DOI: 10.1007/978-3-031-10542-5_6
Fine-Tuning GPT-2 to Patch Programs, Is It Worth It?

Abstract: The application of Artificial Intelligence (AI) in the Software Engineering (SE) field is always a bit delayed compared to state-of-the-art research results. While the Generative Pre-trained Transformer (GPT-2) model was published in 2018, only a few recent works have applied it to SE tasks. One such task is Automated Program Repair (APR), where the applied technique should find a fix for software bugs without human intervention. One problem emerges here: the creation of proper training data is resource intensive and require…

Cited by 4 publications (2 citation statements)

References 20 publications
“…learned an internal representation of the English language that can then be used to extract features useful for downstream tasks. The model is best at what it was pretrained for, however, which is generating texts from a prompt [20,21,22].…”

Section: GPT-2

confidence: 99%
“…The resulting dataset (called WebText) weighs 40GB of texts but has not been publicly released. Developers can find a list of the top 1,000 domains present on Hugging Face [22].…”

Section: GPT-2

confidence: 99%