2022
DOI: 10.48550/arxiv.2203.09707
Preprint
M2TS: Multi-Scale Multi-Modal Approach Based on Transformer for Source Code Summarization

Abstract: Source code summarization aims to generate natural language descriptions of code snippets. Many existing studies learn the syntactic and semantic knowledge of code snippets from their token sequences and Abstract Syntax Trees (ASTs). They use the learned code representations as input to code summarization models, which can accordingly generate summaries describing source code. Traditional models traverse ASTs as sequences or split ASTs into paths as input. However, the former loses the structural properties of… Show more
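The abstract contrasts token-sequence inputs with AST-based inputs and notes that traversing an AST as a flat sequence loses its structural properties. As a purely illustrative sketch (not the M2TS model itself), Python's standard `ast` module shows how an AST flattens into a node-type sequence, the kind of linearization the abstract refers to:

```python
import ast

# A toy code snippet that a summarization model might describe.
code = "def add(a, b):\n    return a + b"
tree = ast.parse(code)

# Flatten the AST into a sequence of node-type names (ast.walk is
# breadth-first). Parent/child links are discarded in the process --
# exactly the structural information that sequence-style AST inputs lose.
node_sequence = [type(node).__name__ for node in ast.walk(tree)]
print(node_sequence)
```

The printed sequence starts at `Module` and includes nodes such as `FunctionDef` and `BinOp`, but nothing in the flat list records which node is whose child; multi-scale approaches like M2TS are motivated by recovering that structure.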

Cited by 2 publications (2 citation statements). References 40 publications.
“…M2TS [10] constructs a multi-view AST feature at multiple local and global levels and proposes a fusion method to combine sequential information and structural information.…”
Section: Experiments and Results
confidence: 99%
“…Ahmad et al. [8] first proposed a transformer-based method for the code summarization task, which achieved excellent performance and led the code summarization area into the transformer-based model stage. Because of the popularity and performance of transformers, almost all recent works [9, 10, 11, 12] are built on the transformer architecture and achieve high scores on each evaluation metric. However, considering only sequence information, without the structure of code, leads to an incomplete representation of code.…”
Section: Introduction
confidence: 99%