Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
DOI: 10.18653/v1/2021.findings-acl.251
Learning Sequential and Structural Information for Source Code Summarization

Abstract: We propose a model that learns both the sequential and the structural features of code for source code summarization. We adopt the abstract syntax tree (AST) and graph convolution to model the structural information and the Transformer to model the sequential information. We convert code snippets into ASTs and apply graph convolution to obtain structurally-encoded node representations. Then, the sequences of the graph-convolutioned AST nodes are processed by the Transformer layers. Since structurally neighboring…
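The pipeline the abstract describes (AST → graph convolution → node sequence into a Transformer) can be sketched minimally. The toy AST, the one-hot node features, and the mean-aggregation step below are illustrative assumptions, not the authors' implementation, which uses learned weights and full Transformer layers.

```python
import numpy as np

def graph_convolve(X, A):
    """One GCN-style propagation step: each AST node averages itself and its
    neighbours, mixing structural context into its representation."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # node degrees
    return (A_hat / deg) @ X                # mean aggregation over neighbours

# Tiny AST for `x = 1 + 2`: Assign -> (Name, BinOp); BinOp -> (Num, Num)
A = np.array([
    [0, 1, 1, 0, 0],   # Assign
    [1, 0, 0, 0, 0],   # Name
    [1, 0, 0, 1, 1],   # BinOp
    [0, 0, 1, 0, 0],   # Num 1
    [0, 0, 1, 0, 0],   # Num 2
], dtype=float)
X = np.eye(5)  # one-hot node-type features, one row per AST node

H = graph_convolve(X, A)  # structurally-encoded node representations
# In the model described above, the rows of H (in AST-node order) would then
# be the input token sequence of the Transformer encoder.
print(H.shape)  # → (5, 5): one structure-enriched vector per AST node
```

Because the propagation matrix is row-normalized, each output row remains a convex combination of the one-hot inputs, i.e. a distribution over neighbouring node types.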

Cited by 19 publications (19 citation statements)
References 14 publications (27 reference statements)
“…Consequently, the power of the corresponding code-to-text models may be affected. To this end, more and more researchers have started to train the learning models with code's structural information, which can be obtained from the code's AST [16,17,19-22,25,31-35] and the Application Programming Interface (API) sequence [13,36,37]. For example, Wan et al. [21] put forward a framework based on reinforcement learning to automatically summarize code snippets.…”
Section: Related Work
confidence: 99%
“…The encodings are then aggregated for generating the summary. Choi et al [11] apply graph convolutions to obtain node representations from an AST and then input the sequence of node representations into the Transformer layers for code summarization. Wu et al [44] construct a multi-view adjacent matrix to represent the relationships between the tokens in source code, and use it to guide the self-attention computation in Transformer.…”
Section: Related Work
confidence: 99%
“…The encodings are then aggregated for generating the summary. • mAST+GCN [11]. The model applies graph convolutions to obtain node representations from an AST and then inputs the sequence of node representations into the Transformer layers for code summarization.…”
Section: The Comparative Models
confidence: 99%
“…Past works handling programs and code have focused on enriching their models by incorporating more semantic and syntactic information from code [1,12,33,43]. Some prior works have cast semantic code search (SCS) as a sequence classification task, where the code is represented as a textual sequence, the input pair (q_i, c_j) is concatenated with a special separator symbol into a single sequence, and the output is the score r_ij = f(q_i, c_j).…”
Section: Neural Models For Semantic Code Search
confidence: 99%
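The sequence-classification framing quoted above (concatenate query and code with a separator, then score the pair) can be sketched as follows. The `[SEP]` token, the toy token lists, and the overlap-based scorer are illustrative assumptions; a real SCS model replaces `f` with a learned Transformer classification head.

```python
SEP = "[SEP]"

def make_input(query_tokens, code_tokens):
    """Concatenate query and code into one sequence, as in r_ij = f(q_i, c_j)."""
    return query_tokens + [SEP] + code_tokens

def f(sequence):
    """Stand-in scorer: fraction of query tokens that reappear in the code.
    A learned model would score the whole concatenated sequence instead."""
    sep = sequence.index(SEP)
    query, code = sequence[:sep], sequence[sep + 1:]
    return sum(t in code for t in query) / max(len(query), 1)

q = ["sorted", "reverse"]                                      # query q_i
c = ["return", "sorted", "(", "xs", ",", "reverse", "=True", ")"]  # code c_j
print(round(f(make_input(q, c)), 2))  # → 1.0
```

The key design point the excerpt highlights is that query and code share one input sequence, so a cross-attention model can compare them token by token rather than matching two separately encoded vectors.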