2020
DOI: 10.1609/aaai.v34i05.6430
TreeGen: A Tree-Based Transformer Architecture for Code Generation

Abstract: A code generation system generates programming language code based on an input natural language description. State-of-the-art approaches rely on neural networks for code generation. However, these code generators suffer from two problems. One is the long dependency problem, where a code element often depends on another far-away code element. A variable reference, for example, depends on its definition, which may appear quite a few lines before. The other problem is structure modeling, as programs contain rich …
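As a minimal illustration of these two problems (my own sketch, not an example from the paper): in the snippet below, the reference to `config` on the last line depends on a definition many lines earlier, and parsing the same code with Python's standard `ast` module exposes the rich tree structure that a plain token-sequence model flattens away.

```python
import ast

# Hypothetical snippet: the use of "config" on the last line depends on a
# definition that appears many lines earlier (the long-dependency problem).
source = """
config = load_defaults()
records = []
for item in fetch_items():
    validate(item)
    normalize(item)
    records.append(item)
report(records)
result = run(config)
"""

# Parsing the code shows the rich structural information (the AST) that
# tree-based generators such as TreeGen produce directly.
tree = ast.parse(source)
print(ast.dump(tree, indent=2))
```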

Cited by 130 publications (76 citation statements)
References 14 publications
“…Typically, Yin and Neubig (2018) propose TRANX, which introduces ASTs as intermediate representations of code and has become the most influential Seq2Tree model. Then, Sun et al. (2019, 2020) respectively explore CNN and Transformer architectures, and Xu et al. (2020) exploit external knowledge to enhance neural code generation models. Generally, all these Seq2Tree models generate ASTs in pre-order traversal, which, however, is not suitable to handle all multi-branch AST nodes.…”
Section: Related Work (mentioning)
confidence: 99%
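To make the pre-order generation order concrete, here is a small sketch of my own (not code from any of the cited papers) that walks a Python AST parent-first, printing node types in the order a pre-order Seq2Tree decoder would expand them.

```python
import ast

def preorder(node, depth=0):
    # Visit the parent node before its children (pre-order traversal),
    # the expansion order used by typical Seq2Tree decoders.
    print("  " * depth + type(node).__name__)
    for child in ast.iter_child_nodes(node):
        preorder(child, depth + 1)

# Prints node types parent-before-children, e.g. Module, Assign, Name, ...
preorder(ast.parse("x = max(a, b)"))
```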
“…models, and obtains great success (Ling et al., 2016; Dong and Lapata, 2016, 2018; Rabinovich et al., 2017; Yin and Neubig, 2017; Hayati et al., 2018; Sun et al., 2019, 2020; Wei et al., 2019; Shin et al., 2019; Xu et al., 2020; Xie et al., 2021). Specifically, an encoder is first used to learn word-level semantic representations of the input NL description.…”
Section: Introduction (mentioning)
confidence: 99%
“…The authors evaluate their model in machine translation and text classification. Sun et al (2020) develop a tree-structured transformer encoder-decoder architecture for code generation. Here, the tree structure is based on the code syntax.…”
Section: Related Work (mentioning)
confidence: 99%
“…Hu et al. [19] proposed an LSTM-based code completion tool by inducing tokens at character and token levels, thereby reducing vocabulary size. Sun et al. [51] developed a neural code generation model that implements an attention mechanism and a tree-based AST reader. These prior efforts targeted imperative languages.…”
Section: Related Work (mentioning)
confidence: 99%