We present PANGU-CODER, a pretrained decoder-only language model adopting the PANGU-α architecture for text-to-code generation, i.e. the synthesis of programming language solutions given a natural language problem description. We train PANGU-CODER using a two-stage strategy: the first stage employs Causal Language Modelling (CLM) to pre-train on raw programming language data, while the second stage uses a combination of Causal Language Modelling and Masked Language Modelling (MLM) training objectives focused on the downstream task of text-to-code generation, training on loosely curated pairs of natural language program definitions and code functions. Finally, we discuss PANGU-CODER-FT, which is fine-tuned on a combination of competitive programming problems and code with continuous integration tests. We evaluate PANGU-CODER with a focus on whether it generates functionally correct programs and demonstrate that it achieves equivalent or better performance than similarly sized models, such as CodeX [16], while attending over a smaller context window and training on less data.
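As a rough illustration of how the second-stage objectives could be combined over docstring-code pairs, the sketch below builds per-example labels so that the causal LM loss is computed only over code tokens (generation conditioned on the natural language prefix), while a fraction of docstring tokens is masked and added back as reconstruction targets. This is a minimal sketch under stated assumptions, not the paper's exact formulation: the function name build_stage2_example, the MASK_ID placeholder, and the 15% masking rate are illustrative choices.

```python
# Illustrative sketch only: one way to turn a <docstring, code> pair into
# (input_ids, labels) combining a CLM-on-code loss with an MLM-style loss
# on the docstring. MASK_ID and the masking rate are assumptions.
import torch

IGNORE_INDEX = -100   # positions excluded from the loss (PyTorch convention)
MASK_ID = 4           # hypothetical [MASK] token id

def build_stage2_example(doc_ids, code_ids, mlm_prob=0.15):
    doc = torch.tensor(doc_ids)
    code = torch.tensor(code_ids)

    # Concatenate docstring and code; by default, only code tokens carry loss,
    # so the model learns to generate code given the natural language prefix.
    input_ids = torch.cat([doc, code])
    labels = torch.cat([torch.full_like(doc, IGNORE_INDEX), code])

    # Mask a subset of docstring positions and ask the model to recover them.
    mask = torch.rand(len(doc)) < mlm_prob
    labels[: len(doc)][mask] = input_ids[: len(doc)][mask]
    input_ids[: len(doc)][mask] = MASK_ID
    return input_ids, labels

# Toy usage with dummy token ids.
inp, lab = build_stage2_example(doc_ids=[11, 12, 13, 14], code_ids=[21, 22, 23])
print(inp, lab)
```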