Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2023.emnlp-main.1013
Exploring Distributional Shifts in Large Language Models for Code Analysis

Shushan Arakelyan,
Rocktim Das,
Yi Mao
et al.

Abstract: We systematically study how three large language models with code capabilities — CodeT5, Codex, and ChatGPT — generalize to out-of-domain data. We consider two fundamental applications: code summarization and code generation. We split data into domains following its natural boundaries: by an organization, by a project, and by a module within the software project. We establish that samples from each new domain present all the models with a significant challenge of distribution shift. We study how established meth…
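The abstract's domain-splitting scheme — grouping code samples by organization, project, or module — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline; the function name, the granularity labels, and the `org/project/module/file` path layout are all assumptions made for the example.

```python
from collections import defaultdict

def split_by_domain(paths, level):
    """Group code file paths into domains along a natural boundary.

    `level` is one of 'organization', 'project', or 'module' --
    hypothetical labels mirroring the three splits the abstract names,
    assuming paths shaped like 'org/project/module/file'.
    """
    depth = {"organization": 1, "project": 2, "module": 3}[level]
    domains = defaultdict(list)
    for path in paths:
        # The domain key is the path prefix up to the chosen depth.
        key = "/".join(path.split("/")[:depth])
        domains[key].append(path)
    return dict(domains)

# Hypothetical sample paths in 'org/project/module/file' form.
samples = [
    "apache/spark/sql/parser.py",
    "apache/spark/core/rdd.py",
    "apache/kafka/clients/producer.py",
    "google/jax/numpy/fft.py",
]

print(sorted(split_by_domain(samples, "organization")))
# Organization-level split yields two domains: 'apache' and 'google'.
```

Coarser boundaries (organization) produce fewer, more heterogeneous domains than finer ones (module), which is what makes holding one out a meaningful distribution-shift test.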

Cited by 0 publications. References 23 publications (32 reference statements).