2019
DOI: 10.1155/2019/6230953
Software Defect Prediction via Attention-Based Recurrent Neural Network

Abstract: In order to improve software reliability, software defect prediction is applied to the process of software maintenance to identify potential bugs. Traditional methods of software defect prediction mainly focus on designing static code metrics, which are input into machine learning classifiers to predict defect probabilities of the code. However, the characteristics of these artificial metrics do not contain the syntactic structures and semantic information of programs. Such information is more significant than…

Cited by 86 publications (67 citation statements)
References 40 publications (43 reference statements)
“…[2] assembled a software defect prediction algorithm based on a deep-tree representation of the AST. Fan et al. [10] applied an attention-based recurrent neural network as the deep learning model over the vector encoding of the code's structure. They built the AST of the code and converted it into a numerical vector.…”
Section: Background and Related Work
confidence: 99%
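The pipeline this statement describes (parse source code into an AST, keep selected nodes as tokens, and map the tokens to a numerical vector) can be sketched as follows. This is a minimal illustration assuming Python source and the standard `ast` module; the node selection, naming, and id assignment below are assumptions for illustration, not the cited paper's exact extraction procedure.

```python
# Minimal sketch: walk an AST, keep selected node types as tokens,
# and map tokens to integer ids so a sequence model can consume them.
# Python's standard `ast` module is used for illustration only.
import ast

def ast_tokens(source: str) -> list[str]:
    """Collect one token per retained AST node (node name or node type)."""
    tokens = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            tokens.append(node.name)            # declaration nodes: keep the name
        elif isinstance(node, ast.Call):
            tokens.append(type(node).__name__)  # invocation nodes: keep the type
        elif isinstance(node, (ast.If, ast.For, ast.While, ast.Try)):
            tokens.append(type(node).__name__)  # control-flow structure
    return tokens

def encode(tokens: list[str], vocab: dict[str, int]) -> list[int]:
    """Map tokens to integer ids, growing the vocabulary on the fly (0 is reserved for padding)."""
    return [vocab.setdefault(t, len(vocab) + 1) for t in tokens]

vocab: dict[str, int] = {}
vec = encode(ast_tokens("def f(x):\n    if x: return g(x)\n"), vocab)
print(vec)  # e.g. [1, 2, 3] -- one integer per retained AST node
```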
“…Features that capture this kind of syntactic structure and semantic information should improve defect prediction performance. Rich features of code semantics and syntactic structure carry specific statistical properties, and ASTs [10] encode these properties, which can help locate and analyze faults more accurately.…”
Section: Introduction
confidence: 99%
“…The LSTM network produces a plausible outcome for source code bug detection. Fan et al [30] presented an attention-based RNN for source code defect prediction. F-measure score and the area under the curve (AUC) were used as model evaluation metrics.…”
Section: Background and Literature Review
confidence: 99%
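As a side note on the two evaluation metrics named in this statement, the sketch below shows how F-measure and AUC are typically computed from predicted defect probabilities. The labels and probabilities are hypothetical, and the use of scikit-learn is an assumption for illustration, not the cited paper's tooling.

```python
# Minimal sketch of the F-measure and AUC metrics on hypothetical outputs.
from sklearn.metrics import f1_score, roc_auc_score

y_true = [0, 0, 1, 1, 1, 0]                 # ground-truth defect labels (hypothetical)
y_prob = [0.2, 0.4, 0.9, 0.7, 0.3, 0.1]     # predicted defect probabilities
y_pred = [int(p >= 0.5) for p in y_prob]    # threshold probabilities at 0.5

print("F-measure:", f1_score(y_true, y_pred))
print("AUC:      ", roc_auc_score(y_true, y_prob))
```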
“…They also replaced rare tokens with a special token to compact the corpus. Fan et al. [29] considered that the hidden features extracted from the AST contain key syntax and semantics, so they employed a bidirectional long short-term memory (Bi-LSTM) network with an attention mechanism to discriminate the key syntactic and semantic features. They recorded plain text for method invocation nodes and extracted node names as tokens for all declaration nodes.…”
Section: B. Abstract Syntax Tree in Defect Prediction
confidence: 99%
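To make the architecture in this statement concrete, below is a minimal PyTorch sketch of a Bi-LSTM with an attention layer over encoded AST token sequences, with a single <UNK> id standing in for rare tokens. Layer sizes, vocabulary size, and the toy batch are illustrative assumptions rather than the cited paper's configuration.

```python
# Minimal sketch: Bi-LSTM with attention over token-id sequences (PyTorch).
import torch
import torch.nn as nn

UNK_ID = 1  # rare tokens are mapped to a single <UNK> id to compact the corpus

class AttentiveBiLSTM(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 64, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)   # scores each time step
        self.out = nn.Linear(2 * hidden, 1)   # defect / clean logit

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        h, _ = self.bilstm(self.embed(token_ids))     # (B, T, 2H)
        weights = torch.softmax(self.att(h), dim=1)   # (B, T, 1) attention weights
        context = (weights * h).sum(dim=1)            # attention-weighted sum over time
        return self.out(context).squeeze(-1)          # (B,) logits

# Usage on a toy batch of two already-encoded, zero-padded token sequences.
model = AttentiveBiLSTM(vocab_size=5000)
batch = torch.tensor([[4, 17, UNK_ID, 9, 0, 0],
                      [3, 8, 12, 25, 6, 2]])
probs = torch.sigmoid(model(batch))  # predicted defect probabilities
```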