Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering (ICPE) Companion, 2017
DOI: 10.1145/3053600.3053614
In-Test Adaptation of Workload in Enterprise Application Performance Testing

Abstract: Performance testing is used to assess if an enterprise application can fulfil its expected Service Level Agreements. However, since some performance issues depend on the input workloads, it is common to use time-consuming and complex iterative test methods, which heavily rely on human expertise. This paper presents an automated approach to dynamically adapt the workload so that issues (e.g. bottlenecks) can be identified more quickly as well as with less effort and expertise. We present promising results from …
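The adaptation idea the abstract describes can be pictured as a feedback loop: the test driver observes a key performance metric and adjusts the workload until it locates the load level where the SLA breaks. The following is a minimal illustrative sketch of such a loop, not the paper's actual implementation; the simulated application, function names, and thresholds are all hypothetical.

```python
def simulated_response_ms(users: int) -> float:
    """Toy application under test: latency degrades sharply past ~80 users."""
    base = 50.0
    return base if users <= 80 else base + (users - 80) ** 2


def adapt_workload(sla_ms: float = 200.0, start_users: int = 10,
                   max_users: int = 500) -> int:
    """Increase load while the SLA holds; on a violation, back off and
    probe with a finer step, converging on the saturation point."""
    users, step = start_users, 40
    while step >= 1:
        latency = simulated_response_ms(users)
        if latency <= sla_ms:
            users += step          # SLA met: push the workload higher
        else:
            users -= step          # SLA violated: back off ...
            step //= 2             # ... and refine the search step
        users = min(max(users, 1), max_users)
    return users


if __name__ == "__main__":
    print(adapt_workload())  # load level just below the SLA violation
```

The point of the sketch is that no prior knowledge of the application's saturation point is needed: the metric feedback steers the workload automatically, which is the kind of tester-effort reduction the paper targets.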

Cited by 5 publications (8 citation statements) | References 13 publications
“…Meanwhile, the work presented in [33] describes the performance analysis of software systems as a comparison of two versions of software and their performance results to find possible (regression) bugs. In response to these limitations, approaches that dynamically adapt the workload have been proposed [3,22]. This paper conducts an evaluation of DYNAMO because of its advantages of adjusting the workload on the fly using the analysis of key performance metrics to create a customized workload for the AUT with minimal knowledge or experience from the tester.…”
Section: Background and Related Work
confidence: 99%
“…(2) The second type of run used the preliminary version of DYNAMO based on heuristic policies derived from PetStore (referred as h-DYNAMO [15]). (3) The third type of run used the work proposed in this paper which adopts our new adaptive logic (DYNAMO).…”
Section: Log Out
confidence: 99%
“…To address this challenge, our research has focused on developing techniques that improve the identification of workload-dependent performance issues, as well as their root causes, in order to increase the productivity of testers (hereinafter referred as users) by reducing the effort and expertise required in this process. In a previous work [15], we proposed an automated approach which dynamically adapts the workload used by a testing tool. However, that preliminary version was based on heuristic policies derived from the studied AUT; hence, it was not practical for real-world usage (as it was not application-independent).…”
Section: Introduction
confidence: 99%
“…Finally, other research works have aimed to reduce the expertise and effort needed to conduct useful testing. For example, by eliminating the need of manually configuring a diagnosis tool [20], or setting an appropriate test workload [21]. In contrast to these works, which have aimed to improve other aspects of the testing process, our tool has been designed with the aim of facilitating the creation of useful cluster test environments.…”
Section: B Related Work
confidence: 99%