2023
DOI: 10.48550/arxiv.2302.04048
Preprint
Automating Code-Related Tasks Through Transformers: The Impact of Pre-training

Abstract: Transformers have gained popularity in the software engineering (SE) literature. These deep learning models are usually pre-trained through a self-supervised objective, meant to provide the model with basic knowledge about a language of interest (e.g., Java). A classic pre-training objective is the masked language model (MLM), in which a percentage of tokens from the input (e.g., a Java method) is masked, with the model in charge of predicting them. Once pre-trained, the model is then finetuned to support the …
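The MLM masking step described in the abstract can be sketched as follows. This is a minimal illustration only: the 15% mask rate and the `[MASK]` placeholder follow common BERT-style conventions rather than anything stated in the paper, and real pre-training pipelines additionally replace some masked positions with random tokens or leave them unchanged.

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly replace a fraction of tokens with a mask token.

    Returns the masked sequence and a dict mapping each masked
    position to its original token (the model's prediction targets).
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # the model must recover these tokens
        else:
            masked.append(tok)
    return masked, targets

# Example input: a tokenized Java method, as in the abstract.
tokens = "public int add ( int a , int b ) { return a + b ; }".split()
masked, targets = mask_tokens(tokens)
```

During pre-training, the model is trained to predict each entry of `targets` from the surrounding (unmasked) context in `masked`.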

Cited by 2 publications (2 citation statements)
References 44 publications (59 reference statements)
Year Published: 2024
“…Researchers have been investigating the potential of utilizing pre-trained large language models (LLMs) to assist software engineers in various aspects of software development. This includes tasks such as code generation [40], completion [41], summarization [42], repair [43], bug fixing [16], and code review [44]. In fact, LLMs have shown remarkable capabilities, outperforming many existing state-of-the-art methods in various code-related tasks.…”
Section: Related Work
confidence: 99%
“…Transformers have revolutionized Natural Language Processing (NLP) and have found applications in various other fields [63], [64]. In software engineering, transformers can be particularly beneficial for tasks like code summarization [42], defect prediction [16], and automated code generation [40]. Their ability to capture long-range dependencies and contextual information makes them highly effective for these complex tasks.…”
Section: A Transformer-Based Model for CNN Bug Localization
confidence: 99%