2023
DOI: 10.1109/tr.2022.3154773
SeTransformer: A Transformer-Based Code Semantic Parser for Code Comment Generation

Cited by 20 publications (24 citation statements)
References 43 publications
“…DeepCom [6] utilizes the traditional SBT, and Hybrid-DeepCom [11] generates comments on Java methods through an encoder-decoder structure with attention. Yang et al. [7] used Sim_SBT, and SeTransformer [4] used improved structure-based traversal (ISBT) to deduplicate SBT and improve its representation. They used the transformer [22] instead of an RNN, which struggles with long-term dependencies on long input data.…”
Section: Learning-based Code Summarization
Mentioning confidence: 99%
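The deduplication the excerpt describes can be illustrated with a small sketch. Classic SBT (as introduced for DeepCom) emits each AST node's label twice, once after the opening bracket and again after the closing one; a deduplicating variant drops the repeated trailing label. The tuple-based AST encoding and the exact ISBT rule below are illustrative assumptions, not SeTransformer's actual implementation.

```python
# Toy AST node: (label, [children]). This encoding is an assumption
# for illustration; real traversals operate on a parser's AST objects.

def sbt(node):
    """Classic SBT: '(' label ...children... ')' label — label appears twice."""
    label, children = node
    seq = ["(", label]
    for child in children:
        seq += sbt(child)
    seq += [")", label]
    return seq

def isbt(node):
    """Deduplicated variant (assumed ISBT rule): emit each label only once."""
    label, children = node
    seq = ["(", label]
    for child in children:
        seq += isbt(child)
    seq += [")"]  # no repeated label after the closing bracket
    return seq

ast = ("MethodDecl", [("Name", []), ("Body", [("Return", [])])])
print(sbt(ast))   # every label appears twice in the sequence
print(isbt(ast))  # shorter sequence, same bracket structure
```

The shorter sequence matters in practice because transformer input length is a hard budget; removing the duplicated labels roughly halves the label tokens while preserving the tree's bracketed structure.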
“…Haiduc et al. [18] used only lexical information, in the form of code tokens, for comment generation. Hybrid-DeepCom [11], ComFormer [7], SeCNN [24], and SeTransformer [4] use source code structure information as well as the source code itself in comment generation. These methods use lexical and syntactic information together.…”
Section: A. Source Code Information, 1) Code Sequence
Mentioning confidence: 99%