2022
DOI: 10.1016/j.jss.2022.111355
Hierarchical semantic-aware neural code representation

Cited by 10 publications (9 citation statements)
References 23 publications
“…Tree-LSTM processes structured input through tree-based units, with Child-Sum Tree-LSTM designed for high branching factor trees, and Binary Tree-LSTM suited for binary trees. Graph-LSTM [31] adapts the Tree-LSTM architecture for processing graph data, incorporating syntax and semantics while preserving the original structures. Nevertheless, LSTM architectures are significantly more complex in the hidden layer, resulting in approximately four times more parameters than a simple RNN architecture [32].…”
Section: Recurrent Neural Networks (RNNs), mentioning
confidence: 99%
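To make the Child-Sum Tree-LSTM and the roughly-four-times-the-parameters claim concrete, here is a minimal sketch of a Child-Sum Tree-LSTM cell in plain NumPy. It follows the standard formulation (gates i, f, o, u; one forget gate per child); all names and dimensions are illustrative, not taken from the cited works.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTMCell:
    """Minimal Child-Sum Tree-LSTM cell: a node's state is computed from its
    input x and the states of an arbitrary number of children."""

    def __init__(self, in_dim, mem_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Four gates (i, f, o, u), each with input and hidden weights --
        # roughly 4x the parameters of a simple RNN cell of the same size.
        self.W = {g: rng.normal(0, 0.1, (mem_dim, in_dim)) for g in "ifou"}
        self.U = {g: rng.normal(0, 0.1, (mem_dim, mem_dim)) for g in "ifou"}
        self.b = {g: np.zeros(mem_dim) for g in "ifou"}

    def __call__(self, x, children):
        """children: list of (h, c) pairs from child nodes (may be empty)."""
        h_sum = sum((h for h, _ in children), np.zeros_like(self.b["i"]))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum + self.b["i"])
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum + self.b["o"])
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum + self.b["u"])
        c = i * u
        # One forget gate per child, conditioned on that child's own h.
        for h_k, c_k in children:
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# Bottom-up pass over a tiny AST-like tree: two leaves feeding one root.
cell = ChildSumTreeLSTMCell(in_dim=4, mem_dim=8)
leaf1 = cell(np.ones(4), [])
leaf2 = cell(-np.ones(4), [])
root_h, root_c = cell(np.zeros(4), [leaf1, leaf2])
n_params = sum(m.size for d in (cell.W, cell.U, cell.b) for m in d.values())
```

Counting the parameters of this cell (416 for `in_dim=4`, `mem_dim=8`) against a simple RNN cell of the same size (104) reproduces the four-fold factor mentioned above.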
“…CDLH was compared to Deckard [83], DLC [4], and SourcererCC [82] on the BigCloneBench and OJClone benchmarks, with results indicating CDLH's superior performance in terms of F1-score on both benchmarks. Ullah et al. [31] (S19) proposed a technique for source code representation using a semantic graph to capture both syntactic and semantic features. Their study compared this technique to several others, including Deckard [83], DLC [4], SourcererCC [82], CDLH [63], FCCD [90], ASTNN [49], and CodeBERT. Ullah et al. [31] (S19) showed strong performance on OJClone, excelling in precision, recall, and F1-score, closely followed by ASTNN [47].…”
Section: PLOS ONE, mentioning
confidence: 99%
“…The study provides valuable insights into the potential of semantic enrichment for improving data quality in sensor networks and offers a promising solution for overcoming data interoperability challenges in this domain. Furthermore, Jiang et al. [20] discuss the use of neural networks to represent computer code in a hierarchical, semantically aware manner. The authors propose a method for improving code representation so that it better captures the code's meaning and structure.…”
Section: Related Work, mentioning
confidence: 99%
“…Additionally, depending on the size and complexity of the solutions being compared, line-level or function-level comparisons might be appropriate. Related research on source code representation [26] has also considered methods such as control flow graphs and program dependence graphs. These approaches could potentially bring even more benefits to the comparison process, but they also present challenges in terms of visual representation and usability for programmers.…”
Section: B. Using Artificial Intelligence To Generate Code, mentioning
confidence: 99%
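The control flow graphs mentioned above can be made concrete with a small sketch: a CFG for a toy function, encoded as an adjacency dict of basic blocks, plus a reachability query. The block names and the function itself are illustrative assumptions, not drawn from the cited work.

```python
# Hand-built control-flow graph for a toy function:
#
#   def f(x):
#       if x > 0:        # B1: branch
#           x = x - 1    # B2: true arm
#       else:
#           x = -x       # B3: false arm
#       return x         # B4: join point
#
# Nodes are basic blocks; edges are possible control transfers.
cfg = {
    "entry": ["B1"],
    "B1": ["B2", "B3"],   # true / false branches
    "B2": ["B4"],
    "B3": ["B4"],
    "B4": ["exit"],
    "exit": [],
}

def reachable(cfg, start="entry"):
    """Return all blocks reachable from `start` (iterative DFS)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(cfg[node])
    return seen
```

Graph-based representations like this (and program dependence graphs, which add data- and control-dependence edges) expose structure that token sequences hide, which is what makes them attractive for code comparison despite the visualization challenges the passage notes.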