2013 17th European Conference on Software Maintenance and Reengineering
DOI: 10.1109/csmr.2013.54

Profile-Based, Load-Independent Anomaly Detection and Analysis in Performance Regression Testing of Software Systems

Abstract: Performance evaluation through regression testing is an important step in the software production process. It aims to ensure that the performance of new releases does not regress under a field-like load. The main outputs of regression tests are metrics that represent the response times of various transactions as well as resource utilization (CPU, disk I/O, and network). In this paper, we propose to use a concept known as the Transaction Profile, which can provide a detailed representation for the t…
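The abstract does not spell out how a Transaction Profile is computed, but one common load-independent quantity in queueing analysis is the per-resource service demand given by the utilization law, D_k = U_k / X (utilization divided by throughput). The sketch below is a minimal illustration under that assumption; the function names, inputs, and the 10% tolerance are hypothetical, not taken from the paper.

```python
# Minimal sketch: approximate a load-independent "Transaction Profile" as the
# per-resource service demands of a transaction via the utilization law
# D_k = U_k / X, then compare baseline and new releases. All names and the
# tolerance below are illustrative assumptions, not the paper's method.

def service_demands(utilizations: dict[str, float], throughput: float) -> dict[str, float]:
    """Service demand per resource: seconds of resource time per transaction."""
    return {res: u / throughput for res, u in utilizations.items()}

def regression_report(baseline: dict[str, float], target: dict[str, float],
                      tolerance: float = 0.10) -> dict[str, float]:
    """Return resources whose service demand grew by more than `tolerance`."""
    return {res: (target[res] - d) / d
            for res, d in baseline.items()
            if d > 0 and res in target and (target[res] - d) / d > tolerance}

# Utilizations are fractions (0..1) measured at a known throughput (tx/s).
base_tp = service_demands({"cpu": 0.42, "disk": 0.18, "net": 0.05}, throughput=70.0)
new_tp  = service_demands({"cpu": 0.55, "disk": 0.19, "net": 0.05}, throughput=70.0)
print(regression_report(base_tp, new_tp))  # {'cpu': 0.309...}: CPU demand regressed
```

Because service demands divide out the offered load, two runs taken at different throughputs remain comparable, which is the appeal of a load-independent representation.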

Cited by 22 publications (8 citation statements). References 11 publications.
“…Then they compare the pairwise correlations between the vectors in the target run with those of the baseline run. Ghaith et al. [32] proposed the use of queuing networks to detect performance regression. Jiang et al. [5, 33] introduced approaches to automatically detect anomalies in a performance test.…”
Section: Related Work
confidence: 99%
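As a rough illustration of the pairwise-correlation idea quoted above, the sketch below correlates performance-counter vectors within a baseline run and within a target run, then reports counter pairs whose correlation shifts noticeably between runs. The counter names and the 0.3 shift threshold are assumptions made for the example, not values from the cited work.

```python
# Illustrative sketch: compare pairwise correlations of performance counters
# between a baseline run and a target run; a large shift in a pair's
# correlation hints at an anomaly. Threshold and data are made up.
import numpy as np

def corr_matrix(counters: dict[str, list[float]]) -> tuple[list[str], np.ndarray]:
    names = sorted(counters)
    return names, np.corrcoef(np.array([counters[n] for n in names]))

def drifted_pairs(baseline, target, shift=0.3):
    names, c_base = corr_matrix(baseline)
    _, c_tgt = corr_matrix(target)  # assumes both runs expose the same counters
    return [(names[i], names[j], round(float(c_base[i, j]), 2), round(float(c_tgt[i, j]), 2))
            for i in range(len(names)) for j in range(i + 1, len(names))
            if abs(c_base[i, j] - c_tgt[i, j]) > shift]

baseline = {"cpu": [10, 20, 30, 40], "disk": [5, 10, 15, 20], "resp_ms": [100, 110, 120, 130]}
target   = {"cpu": [10, 20, 30, 40], "disk": [20, 5, 18, 7],  "resp_ms": [100, 112, 119, 131]}
print(drifted_pairs(baseline, target))  # flags the pairs involving 'disk'
```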
“…The "no-worse-than-before" principle states that the average response time (system performance requirements) for the current version should be at least as good as prior versions [26]. [15], [38], [39], [40], [41], [42], [135], [174], [173], [177], [178] Comparing Against Derived (Threshold and/or target) Data Number of pass/fail requests, past performance metrics Detecting violations in performance and reliability requirements [15], [16], [26], [179] Detecting [86], [181], [182], [183], [184], [185], [186], [187], [188], [ Potential problematic log lines causing memory-related problems [190] • Deriving Target Data…”
Section: Comparing Against Derived Data
confidence: 99%
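The "no-worse-than-before" principle quoted above reduces to comparing mean response times across versions. A minimal sketch, assuming a simple relative slack (the 5% value is an illustrative choice, not from [26]):

```python
# Hedged sketch of a "no-worse-than-before" check: the current version's mean
# response time must not exceed the prior version's by more than `slack`.
from statistics import mean

def no_worse_than_before(prior_ms: list[float], current_ms: list[float],
                         slack: float = 0.05) -> bool:
    return mean(current_ms) <= mean(prior_ms) * (1 + slack)

prior   = [120.0, 131.0, 125.0, 119.0]   # prior-release response times (ms)
current = [124.0, 128.0, 127.0, 123.0]   # current-release response times (ms)
print(no_worse_than_before(prior, current))  # True: 125.5 <= 123.75 * 1.05
```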
“…By this we extend our previous work [9], where we suggested measuring the TP using load generator software.…”
Section: Resources Utilization (RU)
confidence: 96%