2021 8th International Conference on Dependable Systems and Their Applications (DSA)
DOI: 10.1109/dsa52907.2021.00013
ComFormer: Code Comment Generation via Transformer and Fusion Method-based Hybrid Code Representation

Cited by 21 publications (34 citation statements)
References 42 publications
“…Code-to-description: documentation generation [21], [34], [44], [66], [69], [83], [4], [89], [95]; Code-to-code: APR [31], [38], [43], [71], [85],…”
Section: Description-to-code
confidence: 99%
“…RNN: LSTM [21], [24], [34], [43], [44], [47], [50], [66], [67], [73], [97], [111]; GRU [31], [69]; Other [18], [114]. Transformer [2], [47], [75], [76], [79], [83], [85], [4], [89], [91], [95], [102], [114]. CNN [24], [29], [38], [66], [69], [89], [114]. Other neural: MLP [59], [63]; Graph NN [69]; Neural trees…”
Section: Model Type, Sub-type, Studies
confidence: 99%
“…Finally, they used beam search to improve comment quality. Yang et al. [27] proposed a novel method, ComFormer, based on the Transformer and a fusion method-based hybrid code representation. LeClair et al. [28] considered graph neural networks, building on the graph2seq model [29]. Ahmad et al. [30] considered the vanilla Transformer architecture, and used a relative position representation and a copy mechanism.…”
Section: B. Deep Learning-based Text Summarization and Source Code Summarization
confidence: 99%
“…For example, identifiers following the camel-case naming convention are split into multiple words to alleviate the OOV problem. Yang et al. [39] proposed a novel method, ComFormer, based on the Transformer and a fusion method-based hybrid code representation. Recently, Kang et al. [40] investigated whether using pre-trained word embeddings could improve model performance.…”
Section: B. Code Generation and Code Summarization
confidence: 99%
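The excerpt above mentions splitting camel-case identifiers into sub-words to shrink the out-of-vocabulary problem. A minimal sketch of that preprocessing step is shown below; the function name and regex are illustrative choices, not taken from any of the cited works.

```python
import re

def split_identifier(identifier: str) -> list[str]:
    """Split a camelCase/PascalCase identifier into lowercase sub-tokens.

    The pattern handles acronym runs (e.g. "HTTP"), ordinary words,
    and digit groups, so "getHTTPResponseCode" yields four tokens.
    """
    parts = re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", identifier)
    return [p.lower() for p in parts]

print(split_identifier("getHTTPResponseCode"))  # ['get', 'http', 'response', 'code']
```

After splitting, a model only needs embeddings for common sub-words like "get" and "response" rather than for every full identifier seen in training code.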
“…Hu et al. [58] added the pointer network [59] to the Transformer to solve this problem. Yang et al. [39] alleviated the OOV problem with the help of a subword tokenization method (i.e., Byte BPE [60]). However, due to the nature of shellcode statements, it is more reliable to fix this type of error based on rules.…”
Section: Rule-based Repair Component
confidence: 99%
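The last excerpt credits subword tokenization (BPE) with alleviating OOV tokens. The sketch below shows the core of byte-pair-encoding merge learning in generic form; it is a simplified illustration of the general algorithm, not the specific Byte BPE configuration used in the cited work.

```python
from collections import Counter

def learn_bpe_merges(corpus: list[str], num_merges: int) -> list[tuple[str, str]]:
    """Learn BPE merge rules: repeatedly fuse the most frequent symbol pair."""
    # Start with each word as a sequence of single characters.
    vocab = Counter(tuple(word) for word in corpus)
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rewrite the vocabulary with the chosen pair fused into one symbol.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

# Frequent fragments like "get" become single vocabulary entries,
# while a rare identifier still decomposes into known subwords.
print(learn_bpe_merges(["getdata", "getname", "getsize"], 2))
```

Because every rare token decomposes into learned subword units, the model's vocabulary stays small without mapping unseen identifiers to a single UNK symbol.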