2023
DOI: 10.1142/s0218539323500055
Intelligent Software Bug Prediction Framework with Parameter-Tuned LSTM with Attention Mechanism Using Adaptive Target-Based Pooling Deep Features

Abstract: In recent years, various researchers have designed software bug prediction models for classifying the faulty and nonfaulty modules in software that are correlated with software constraints. Software bug or defect prediction helps programmers or developers discover the likelihood of bugs and minimize maintenance costs. However, most approaches do not address the class-imbalance issue in software bug prediction. To solve these issues, the latest software bug prediction model using enhanced dee…

Cited by 2 publications (3 citation statements)
References 40 publications
“…This enables the model to capture the time series information in the input data and automatically select the most relevant features for prediction [43][44][45]. The superior precision of the CLAP prediction model stems from two fundamental components: the incorporation of the attention mechanism and the synergistic combination of CNN and LSTM neural network architectures [46][47][48][49]. The attention mechanism plays a pivotal role by assigning varying attention weights to different segments of the input data.…”
Section: Results
confidence: 99%
“…The attention mechanism plays a pivotal role by assigning varying attention weights to different segments of the input data. This enables the model to effectively diminish the influence of irrelevant information and instead prioritize relevant and important details during the processing and learning tasks [46][47][48][49]. The attention mechanism was integrated into CLAP, allowing it to selectively concentrate on significant aspects within the input experimental data [46][47][48][49].…”
Section: Results
confidence: 99%
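The attention weighting described in these passages — assigning varying weights to segments of the input and pooling the relevant parts — can be sketched minimally as additive attention over a sequence of hidden states. This is an illustrative NumPy sketch, not the cited model's actual implementation; all parameter names and shapes are hypothetical:

```python
import numpy as np

def attention_pool(h, w, b, v):
    """Additive attention pooling over time steps.

    h: (T, d) sequence of hidden states (e.g. LSTM outputs)
    w: (d, d) projection matrix, b: (d,) bias, v: (d,) scoring vector
    Returns the attention-weighted context vector (d,) and the weights (T,).
    """
    scores = np.tanh(h @ w + b) @ v           # (T,) unnormalized relevance scores
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()                  # weights sum to 1 across time steps
    context = weights @ h                     # (d,) weighted sum of hidden states
    return context, weights

# Usage with random data standing in for LSTM outputs:
rng = np.random.default_rng(0)
T, d = 5, 8
h = rng.normal(size=(T, d))
ctx, a = attention_pool(h, rng.normal(size=(d, d)), np.zeros(d), rng.normal(size=d))
```

Irrelevant time steps receive weights near zero after the softmax, which is how the mechanism "diminishes the influence of irrelevant information" while emphasizing the segments that matter for the prediction.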