2019
DOI: 10.1609/aaai.v33i01.33017055

A Grammar-Based Structural CNN Decoder for Code Generation

Abstract: Code generation maps a program description to executable source code in a programming language. Existing approaches mainly rely on a recurrent neural network (RNN) as the decoder. However, we find that a program contains significantly more tokens than a natural language sentence, and thus it may be inappropriate for an RNN to capture such a long sequence. In this paper, we propose a grammar-based structural convolutional neural network (CNN) for code generation. Our model generates a program by predicting the grammar rules of the programming language…
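
To make the abstract's central idea concrete, here is a minimal sketch of grammar-rule-based generation. This is an illustration, not the authors' model: the toy grammar and the `score_rule` callback are assumptions standing in for the structural CNN's learned rule scorer.

```python
# Toy fragment of a programming-language grammar: each non-terminal
# maps to its candidate production rules (right-hand sides).
GRAMMAR = {
    "stmt": [["expr", ";"], ["if", "(", "expr", ")", "stmt"]],
    "expr": [["NAME"], ["expr", "+", "expr"]],
}

def generate(score_rule, max_steps=50):
    """Grow a program by repeatedly expanding the leftmost
    non-terminal with the highest-scoring production rule."""
    symbols = ["stmt"]  # start symbol
    for _ in range(max_steps):
        # Find the leftmost unexpanded non-terminal, if any.
        idx = next((i for i, s in enumerate(symbols) if s in GRAMMAR), None)
        if idx is None:
            return symbols  # all symbols are terminals: program complete
        rules = GRAMMAR[symbols[idx]]
        best = max(rules, key=lambda rhs: score_rule(symbols, idx, rhs))
        symbols[idx:idx + 1] = best  # apply the production in place
    return symbols

# A trivial scorer that prefers shorter rules stands in for the network.
print(generate(lambda symbols, idx, rhs: -len(rhs)))  # -> ['NAME', ';']
```

The key property is that every decoding step applies a legal production rule, so the output is syntactically valid by construction; the neural network only has to rank candidate rules.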

Cited by 98 publications (70 citation statements: 0 supporting, 70 mentioning, 0 contrasting)
References 4 publications

Citation statements, ordered by relevance:

“…Gu et al [11] applied deep learning to code search. Sun et al [40] proposed a CNN-based model for code generation.…”
Section: Deep Learning for Source Code (mentioning)
confidence: 99%

“…Rabinovich et al (2017) presented an abstract syntax network that combines edge information for code generation. Convolutional neural networks (CNNs) were used for code generation decoding because the output program is much longer than in semantic parsing and MWPs, and RNNs suffer from the long-dependency problem (Sun et al, 2018).…”
Section: Seq2Tree Architectures (mentioning)
confidence: 99%
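
The long-sequence argument in this quote can be illustrated with a hypothetical numpy sketch (layer count, kernel size, and dimensions are assumptions, not the paper's architecture): stacking convolutional layers widens the receptive field linearly with depth, and every output position is computed independently, avoiding the step-by-step recurrence that makes RNNs struggle with long programs.

```python
import numpy as np

def conv_layer(x, w):
    """Valid 1-D convolution followed by ReLU.
    x: (seq_len, d_in); w: (kernel, d_in, d_out)."""
    k = w.shape[0]
    out = np.stack([
        sum(x[i + j] @ w[j] for j in range(k))
        for i in range(len(x) - k + 1)
    ])
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
h = rng.normal(size=(200, 16))     # 200 embedded rules/tokens, dim 16
for _ in range(4):                 # 4 layers with kernel size 5
    h = conv_layer(h, rng.normal(size=(5, 16, 16)) * 0.1)
print(h.shape)                     # (184, 16); each output position now
                                   # sees 4 * (5 - 1) + 1 = 17 inputs
```
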
“…The neural network produces distributed representations of inputs and outputs, which are used to pick the appropriate function and corresponding arguments. Sun et al [2018] constrain the search for programs using a neural network by integrating a grammar into their inference procedure. Their neural network learns to select the appropriate grammar production to expand a non-terminal in an intermediate program representation.…”
Section: Neural Program Synthesis (mentioning)
confidence: 99%
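
One common way to realize this kind of grammar-constrained selection is to mask the network's logits so that only productions whose left-hand side matches the current non-terminal can be chosen. The sketch below is a hedged illustration; the rule table and the logits are made-up stand-ins for a trained model's outputs.

```python
import numpy as np

# Hypothetical rule table: (left-hand side, right-hand side).
RULES = [
    ("stmt", ("expr", ";")),
    ("expr", ("NAME",)),
    ("expr", ("expr", "+", "expr")),
]

def pick_rule(logits, nonterminal):
    """Pick the highest-probability rule that can legally expand
    `nonterminal`, masking all other rules out of the softmax."""
    mask = np.array([lhs == nonterminal for lhs, _ in RULES])
    masked = np.where(mask, logits, -np.inf)
    probs = np.exp(masked - masked.max())
    probs /= probs.sum()
    return int(np.argmax(probs))

# With these fabricated logits, only the two "expr" rules compete;
# the higher-scoring "stmt" rule (index 0) is masked out as illegal.
print(pick_rule(np.array([2.0, 0.5, 1.0]), "expr"))  # -> 2
```
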
“…This approach is closer in spirit to the conditional program generation used in Bayou. Additionally, with the exception of Sun et al [2018], many of these systems have focused on a different domain for evaluation: string manipulation tasks.…”
Section: Neural Program Synthesis (mentioning)
confidence: 99%